Host: Fascinating conversation on how to reclaim our humanity by bringing emotional intelligence to technology. And a little more about Rana: she was named by Forbes to their list of America's Top 50 Women in Tech, and Fortune included her in their 40 Under 40 list. She was also the co-host of a PBS NOVA series on AI in 2018, and she is an all-around awesome person. Really great to have you here, Rana. Before we dive in, I want to share how this topic ties into our mission. At IV, everything we do is to help ourselves unlock our own potential and the potential of the people around us. That happens through learning and community. The more we learn about the world and ourselves, and the more we surround ourselves with other passionate individuals, the better we'll unlock our potential. When I reflect on this topic, emotional intelligence and AI, it isn't just about machines. It's also a lot about us and the future that we want to shape and be part of. So I want to kick off by asking you: why is this topic so important? Why have you chosen to write a book on this subject and to build a business around it? Why is it so critical, and what was the journey you lived through to get here?

Guest: Hi, everybody. Our emotional intelligence matters, right? And the thesis is, technology has a lot of cognitive intelligence, a lot of IQ, but no EQ. If we are able to build EQ into our technology, that not only transforms human-machine interfaces, like how we connect with our phone or laptop, but more importantly, to your point, how we connect and communicate with one another. That's been the driving force of my work. I've been doing this for over 20 years now, and it's fundamentally about improving the way humans connect and communicate with each other.

Host: Thank you. And what is at stake here?
Long term, if we get this right, get emotional intelligence in machines right, along with our ability to co-create with automated systems, what can we hope for? And if it goes horribly wrong, what is at stake? What do we stand to potentially lose?

Guest: Emotional intelligence is at the core of how we build trust with each other as humans. And it's often not the explicit kind of license agreements or forms we have to sign to build trust with one another. It's usually the implicit, nuanced, nonverbal signals we exchange, and based on these signals we make thousands and thousands of decisions that involve trusting one another: trusting our colleagues, our family members, our communities. So if you think about how that works in a virtual world, and of course now with the global pandemic, how do you build trust? How do you build trust with technology? How do you build trust with one another? That's what is at stake. To me, if we can't trust each other, we can't really move forward as a society. The other piece of this is empathy. Empathy is at the core of who we are as humans, and I talk about this in the book. We're in the midst of an empathy crisis. With this global pandemic, the whole world is going through the same kind of situation, though of course everybody has their own experience of it. I hope that we emerge out of it with more empathy. So empathy is another big theme, and it's definitely at stake.

Host: Thank you so much. You're mentioning two key ingredients; collaboration and life really do boil down to trust and empathy. And when we talk about AI, when you say the words, I don't know about you, but probably most people visualize the sci-fi horror story, while other sci-fi series have more friendly characters. Painting a picture looking ahead, whatever it may be, 10, 20, 50 years, what do you think the possibility is here?
If we play our cards right, could machines be as full of empathy and trust-building as humans? Is that possible? Is that something we should absolutely drive for? And I want to ask you, if we don't do that, what is the other path, and what is the danger of it?

Guest: If we do this right, I envision emotional AI becoming ubiquitous. That means the way we interface with our technology, be it your car or phone or whatever device it is, becomes the same way we interface and connect with one another: through conversation, and we're already seeing that with Siri and Alexa, et cetera; through perception, so we're starting to see our devices have eyes in the form of cameras, et cetera; and then ultimately through empathy. The framework I use to project this into the future is that it's a tool. In the same way that if you have a hearing problem you go out and buy a hearing aid, or if you don't see very well, if you have vision problems, you use contact lenses or eyeglasses. Well, in a virtual world, which a lot of us are in, you don't have the same quality of emotional intelligence. Your EQ is lower, so I think of these tools as augmenting our EQ. It's more of a partnership. It's not us versus the social robots. I think social robots can help us do our jobs better, and so I'm excited about that. If it goes wrong, and I think we can talk about the ways in which this can go wrong, for me it's around respecting privacy and consent, and being a trusted partner so that people feel comfortable sharing this very personal information. That's the concern I have. I think we need to think about where these technologies can be deployed; we can double-click into that. And then of course bias: how do you make sure these technologies are not biased against a particular group of people?

Host: Absolutely. You're saying people get tools like hearing aids and so forth. People also buy pets.
So there's another element to this: companionship, being a fellow. Humans don't just need solutions. We want to feel good, and it seems like there is an opportunity here where people can be assisted by machines in a broader way.

Guest: Can you imagine, again, spending so much time on our devices, if there were a life coach, or an equivalent of Siri that gets to know you really well, and it can flag, "You seem extra stressed today. Maybe you need to take a yoga class," and can suggest that for you? Or it can say, "I noticed you were super unempathetic on that last Zoom call," or "You were kind of mean or rude; maybe you should consider doing x, y, and z." Or it can flag things like depression, and even suicidal intent. Or a learning companion: if you're taking that online class and you don't have the grit to keep going, it can cheer you on. I think there's a lot of opportunity for this idea of a coach that helps you be more productive or more healthy or more connected. It has your back.

Host: Excellent. Thank you for that. So, share with us the current landscape. Where are we today? What's most cutting edge right now, and what are you most excited about? When you look at the different pockets of innovation in this space, what currently seems to you like the most exciting, most game-changing technology?

Guest: I want to maybe start by explaining the core technology. Of the way humans communicate, only 10 percent is in the actual choice of words we use. The other 90-plus percent is nonverbal: your facial expressions, hand gestures, and your vocal intonation. Companies in our space are very focused on facial expressions, using computer vision and machine learning to map all of these facial movements into an understanding of the state of the person: happy, sad, do they look tired, am I driving a car and falling asleep? All of these cognitive and emotional states are key.
And then, I would say, we've started combining in vocal intonation: are you monotone, are you excited in the way you're talking? So for all of those nonverbal signals, we're now able to develop algorithms to detect them, and we partner with a number of industries, like the automotive industry, to make sure the driver, for example, is not distracted or falling asleep, so the car can take some action to make sure the occupants are safe. That's one industry we're very excited about and spending a lot of time focused on. And given where we are right now, a big area we're exploring is virtual events: how do you create a shared experience when we're all just communicating online? The example I like to give: if we were all in the same room right now, which we would have been if the pandemic weren't happening, you would riff off the energy of the audience, see when people perk up based on a question, and probably try to build on that. It's so hard right now for me because I can't see the expressions of the attendees, and I find that really isolating. It's a different kind of experience, and it's less of a shared experience. But imagine if emotional AI were in place and we could see a real-time engagement chart, so if you said something funny you would see the smile curve go up. So I think there are ways to bring in these technologies to create more of a shared experience.

Host: Absolutely. The key point here is, given that only 10 percent of how we communicate is determined by the content of what we're saying and 90 percent is gesture-based and so forth, the question is how to use technology to better build on that 90 percent of how we communicate, both with each other and with the technology we use. So what advances are right around the corner? What can we hope to see in the very near term, not 20 years out but the very near term? Today I'm using my Google Home to turn off the lights and play songs.
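The real-time engagement chart described above is, at its core, simple aggregation: given a per-frame smile probability for each attendee (which a real facial-expression model would produce; here the numbers are hand-written stand-ins), average across the audience and smooth over time. A minimal, hypothetical sketch, not Affectiva's actual API:

```python
from statistics import mean

def engagement_curve(frames, window=3):
    """frames: list of per-frame dicts {attendee_id: smile_probability in [0, 1]}.
    Returns one smoothed engagement score per frame: the audience-wide mean
    smile probability, averaged over a trailing window of frames."""
    raw = [mean(f.values()) for f in frames]  # audience average per frame
    smoothed = []
    for i in range(len(raw)):
        lo = max(0, i - window + 1)
        smoothed.append(mean(raw[lo:i + 1]))  # trailing moving average
    return smoothed

# A joke lands around the third frame: the smile curve goes up.
frames = [
    {"a": 0.1, "b": 0.2},
    {"a": 0.1, "b": 0.1},
    {"a": 0.6, "b": 0.7},
    {"a": 0.9, "b": 0.8},
]
curve = engagement_curve(frames)
```

The smoothing window is the only real design choice here: too short and the curve jitters with every blink, too long and the speaker sees the laugh ten seconds late.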
And I know it remembers things about what I requested before, but I don't feel like I have a shared history with this machine. I don't feel like it really knows me. What do you think will be the most important changes in the near term, not just in terms of what the technology does but specifically in how we live, and how the way we live is augmented?

Guest: I imagine a lot of these devices and conversational agents, once they have a little empathy and emotional AI, moving from being transactional to truly conversational. For now, the way we all use these devices is to ask Alexa or Google Home to do something for us: play music, order blah blah blah, and that's it. There's no back and forth. It can't say, "Oh, you look a little tired today," and you say, "I didn't sleep well," and it says, "Maybe I'll order some chicken soup for you." There's none of this back and forth, and the only way you get there is if you start having a sense of history, a personal profile, what your baseline is, so that when you deviate from it, it can say, "You look a little off today." Or it could be, "You look upbeat today. What's going on?" It could go either way; it doesn't have to be negative emotions. But there has to be this sense that it knows who you are as a person, and that only comes with a little bit of emotional intelligence, and that unlocks all sorts of applications beyond the transactional. It can really be a conduit for helping us do all sorts of things.

Host: Is it correct to say a major limitation today is simply the recording piece of this? Because with a smartwatch, an Apple Watch, even if I don't do anything it's still recording how many steps I'm taking, my heart rate, and that kind of stuff. So theoretically, if I were wearing a device, or if there were devices around me, that measured my facial expressions, gestures, and all that, and maybe also measured chemicals in my body, would you say that a lot of the limitations are around the recording part of this?
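The "personal baseline" idea above, a device noticing you look a little off (or upbeat) today, reduces to anomaly detection against your own history. A minimal sketch under simple assumptions (a single daily mood score, a z-score test; all names hypothetical):

```python
from statistics import mean, stdev

def flag_deviation(history, today, threshold=2.0):
    """history: past daily mood scores for one person (their personal baseline).
    Returns 'off today', 'upbeat today', or None when today is within
    `threshold` standard deviations of that person's own mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return None  # no variation yet; nothing to compare against
    z = (today - mu) / sigma
    if z <= -threshold:
        return "off today"      # notably below this person's baseline
    if z >= threshold:
        return "upbeat today"   # deviation can be positive, too
    return None

# A week of fairly steady scores, then an unusually low day.
history = [6.0, 7.0, 6.5, 7.0, 6.5, 7.0, 6.0]
```

The point of the guest's framing survives even in this toy version: the same score of 3 that is alarming for this person might be perfectly normal for someone else, because the comparison is against the individual's history, not a population average.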
And then also, what else is in the equation? What about the data captured, and communicating with you in a way that's not super annoying or disruptive?

Guest: Exactly. There are two parts: the sensing part and the action part. We are the sensing part, but of course we collaborate with our partners, like the automakers, to make sure the action part isn't annoying. The limitation for the sensing part depends a little on the situation and the context. I'll come back to trust. I think it's really critical that there is a level of trust, that the user knows this data is collected. It doesn't have to be recorded; it can be processed on the fly and used to create a better experience for you. There has to be value in return for you, and that's where a lot of the innovation is happening. I'll go back to the automotive industry, where we've spent a lot of time. Car companies are putting cameras in the vehicle. The data never leaves the car and never gets recorded, but it's looking for things like signs of fatigue and signs of distraction, and the value proposition is that it will be a safer experience. I think that's a very clear value proposition. But it's a dialogue: how much are users and consumers willing to, what's the word, be okay with these sensors being around us?

Host: There's a major book where, at the end, the author's prediction, and I'm paraphrasing here, is that because we find it so difficult to make decisions as humans, whenever we can, we'd rather not make a decision; we'd rather just be given direction. So instead of learning how to get to a place myself, I'll just keep looking at Google Maps as it tells me where to go. He claims in that book that because of this tendency, we will give up all our information so that our apps can tell us who to date, what to eat, what business decision to make, what job to choose if you are given two offers. It could be so many things. And the way he phrases it, he sees it as inevitable.
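The in-cabin fatigue case above, process on the fly, record nothing, can be sketched as a streaming check. One common proxy in the driver-monitoring literature is the fraction of recent frames in which the eyes are closed (a PERCLOS-style measure); the class below is a hypothetical stand-in for a real eye-state classifier, not any automaker's system:

```python
from collections import deque

class FatigueMonitor:
    """Flags fatigue when the share of eyes-closed frames in a sliding
    window exceeds a threshold. Frames are processed and discarded:
    only booleans in a short bounded window are kept, nothing is recorded."""
    def __init__(self, window=30, threshold=0.4):
        self.recent = deque(maxlen=window)  # old frames fall off automatically
        self.threshold = threshold

    def update(self, eyes_closed: bool) -> bool:
        self.recent.append(eyes_closed)
        ratio = sum(self.recent) / len(self.recent)
        return ratio > self.threshold  # True -> alert the driver

# Simulate a drive where the eyes start staying closed.
monitor = FatigueMonitor(window=10, threshold=0.4)
alerts = [monitor.update(closed) for closed in [False] * 8 + [True] * 6]
```

The bounded deque is the privacy argument in miniature: the only state ever held is a few recent yes/no values, and raising the alert requires no stored video at all.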
And then another thing to add to this: because we're stuck at home, I was looking at getting a Peloton, and there's a monthly payment plan where you just connect your bank account and they see your transactions. I won't do that, but if it gets you the financing, a lot of people will. Do you believe, is it your prediction, that what is now restricted to a card could go further, that if it were connected to the cloud and people got 10x the value, people would be pretty open to giving technology access to their whole lives?

Guest: I think if you're getting enough value and you trust the technology, and of course, by extension, the creators, the folks behind the technology, then yes. I think we have seen tons of examples where people are doing that already, and if there's convenience and value and a better experience, people will. We're already giving up so much data, and I think a lot of people didn't realize until the recent tech backlash how much data we're sharing. Talk about power asymmetry. Unfortunately, so far only a small number of companies or governments have had access to all this data, and we as consumers were not really beneficiaries of it. I think there needs to be a rebalance of who has access to this data and who is getting value out of it.

Host: If you could help crystallize it for us: what is holding us back?

Guest: It's a really complex problem. People's emotions and behaviors are actually pretty complex from a computer vision standpoint. We all look different and express in different ways, and how do you map that to an actual understanding of the person's state? I could be smiling all day long but broken on the inside. So it's complex. It's not just, oh, if you smile you're happy, if you raise your eyebrows you're surprised.
That's a very naive, oversimplified view of humans, and it will take time to code for these states. And then the third thing is that consumers have to be on board, so the value has to be there, and we're still experimenting with what the right applications are. I was on a call earlier this morning, prepping for an event about the future of relationships, and we actually talked about dating apps and how this technology could redefine what dating looks like, especially, again, given that a lot of it is happening virtually. Very interesting. If I had a dating app that could say, based on your character, but also based on the chemistry, "If you meet this person in the real world, you are going to hit it off," I'd pay for that. We don't have that right now.

Host: Right. So we have a related question on this from our member Helen. She asks: how about future EQ-enabled therapy, or even personal and career coaching? Are there new nuances? And I want to add a little bit to this question. There was a great UK-based show that came out a couple of years ago featuring android-like AI that looked just like humans but with a different eye color. Everything else about the world was the same, but one scene completely blew me away. The two main characters go in for marriage counseling, and it's an AI marriage counselor. Before seeing that, if you had asked me what the very last job robots might take, I would have said something like that, because it's so emotionally interconnected. But in the show, the couple is having an interaction, and this human-looking android AI says, essentially, "Okay, based on 18 billion data points, with these actions there's a 97 percent probability this thing is causing that thing, and therefore you should do this, with this percentage of confidence."
You see that and you're like, why would you ever use a human one again? So, with both Helen's question and that perspective in mind, is there anything you think AI won't be able to replicate, in emotional or EQ terms?

Guest: That's an awesome question. Where do we start? I think this example is really fascinating, because there's something magical and amazing, almost an oxymoron, in being able to quantify emotions and bring a data-driven approach to something we feel is so irrational and not quantifiable, but it is quantifiable. I like your 18-billion-data-points example for making decisions. The other thing is, we do make a lot of our decisions based on our emotions. We make emotion-based decisions about where we want to live, our careers, our partners, what to have for breakfast. It influences our health and our sense of mental well-being, how well we sleep. Our emotions are integrated into our lives, and people don't realize that. So it makes sense to be able to quantify them. Back to the counselor case: there's fascinating research out of USC where they brought in post-traumatic stress disorder patients; half of the patients saw a human psychiatrist and the other half saw an avatar, and they found that the patients were more forthcoming with the avatar. They just shared a lot more, confided a lot more, because they perceived the avatars to be less judgmental than humans. And I find that kind of sad, right? So will these robot counselors replace human counselors? Not anytime soon, because there's a lot of work to do, but it's a partnership. Maybe for the less critical cases. Let's take COVID, for example. You're getting tested and you walk into a clinic or hospital; maybe the frontline health workers are actually social robots, and they say, "Oh hi, you don't feel so well; let me do a number of tests on you," and if it's critical you get moved to an actual human being.
If it's not critical, you've gotten a little empathy, you feel better, and you turn around and go home.

Host: The PTSD example you gave is a great one. I can see how the nonjudgmental nature of a nonhuman avatar counselor might be desired. There's also the fact that the avatar won't have a bad day. The avatar will be objective, has no agenda, doesn't get defensive; it just does what it does. What it doesn't have is necessarily a shared experience, and it's not something we can have a relationship with. So there are a lot of jobs, even currently, where someone has their lawyer, a very personal relationship, or their banker, a very personal relationship, even though 95 percent of the things you do with your banker or lawyer could be automated today. So do you think the ultimate outcome could be partnerships? You still go to a human PTSD counselor, but the human is backed by the AI that gives them the super-objective, data-driven material, and people will still pay for the human-to-human relationship because they want to know their therapist, while behind the scenes there's big data and objective, nonemotional analysis driving what that human counselor is saying.

Guest: I totally agree with that, and this kind of AI could be a conduit for the human counselor. I can't get my counselor's ear 24/7, but maybe there are situations where I can access that AI counselor's proxy until I'm able to. We have seen examples, actually a robot, an M.I.T.—what is David asking?

Host: I'll get to his question.

Guest: Oh, this is a good question. David, there is actually a social robot out there called Mabu. It's an M.I.T. spinout, but they're based in San Francisco, and the robot is designed to help terminally ill patients.
You get sent home with a cute yellow robot, and the robot makes sure you take your medication at the same time each day, checks in daily, and listens for whether you're not feeling well, both physically and mentally, and it will flag that to an actual nurse or clinician so they can intervene. What's the alternative? You can't send a human nurse home with every terminally ill patient, and that's where it's a partnership. It's the next best thing to actually having an amazing human being with you the whole time. And as I said, this robot is out there. Is there a lot more work to be done for the robot to be effective? Absolutely. But there are videos of patients talking about their relationship with Mabu, and it's absolutely fascinating. It sounds weird, but you really do build a connection with these devices, and it can sometimes save the day, because you feel like somebody is on the lookout for you.

Host: Absolutely. So, Rana, that's a fantastic example, and just to underline the question David was asking specifically: where is the line between reality and science fiction? The way I want to ask it is, there can always be more sci-fi possibilities than we're imagining, but is there a line? Are there things that will never happen, that are just not possible? Or do you think, since you're in this industry at the cutting edge, that nothing we've talked about isn't going to happen, that it's all going to happen sooner or later? Fact versus fiction: is there a line to be drawn, or is it inevitable that we'll get there?

Guest: I think where I have a hard time is believing the scenario where we have this superpowered AGI, or AGIs, that will take over the universe and destroy all human beings, this notion of existential threat. We are designing these devices. Why would they turn on us? Just pull the plug. So that's where I feel that's not realistic. The entire framework of humans versus AI, I don't subscribe to it at all. I think of these things as tools that augment our abilities.
We get to design them, we get to develop them, we decide how to deploy them. We're in control. Now, I don't always trust humans. That's where I would say it's not about the technology; it's about who is designing it. And do we have a shared set of core values? We don't. Some countries care about ethics, and some other countries couldn't care less. That's where the problem is.

Host: Absolutely. So, building off of that, we have a great question here, and I'm betting the person asking it suspects you have a fascinating perspective on gender roles and expectations: as a woman executive in tech working on making tech more emotionally intelligent, will AI evolve to react differently to different genders, races, and nationalities, and are there any dangers of discrimination there?

Guest: That is an amazing question. Before we went live, we chatted about this, because we both have a Middle Eastern background: what if an Egyptian had designed Alexa? My experience as a woman in this industry has been really interesting. My co-founder is an M.I.T. professor who started the whole field. She wrote the book, and she evangelized, back in the day, the idea that computers need to have emotional intelligence. I read the book in Egypt, and it transformed my life; it's why I'm in the U.S. doing what I'm doing. I was so inspired by it. My favorite story is from when we were at M.I.T. deciding to start Affectiva. We went on an investor pitch trip and had a ton of investors lined up, and it was really interesting, because there we were, two women scientists; at the time I used to wear the hijab, and we clearly avoided the "e" word. We did not use the word "emotion" at all. We danced around it: sentiment, affect, arousal, but not emotion. Fast forward ten years, and I feel the world has changed. There's much more realization that there is a role for emotions and emotional intelligence, and I see that from both men and women, which is awesome.
So I think it's changing, slowly. Now, in terms of whether these technologies should respond differently to men versus women: we have 5 billion facial frames that we have collected, with people's consent, from 90 countries around the world, and we already see that there are gender differences in how people express emotion. There are cultural differences in how people express emotion. In the U.S., we find that women smile 40 percent more than men do when they're watching content. In France and Germany, it's only 25 percent more, and in the UK we found no statistically significant difference between men and women. It's early days; we need to dig into that more. But it is really fascinating.

Host: Thank you, Rana. I think it definitely deserves a YouTube skit: Alexa in different countries. Here's a great question from Robert Snyder, two questions on the topic of the marriage of EQ and AI and getting that right. Question one: other than national governments, what institutions are in place, or should be in place, to keep things healthy, encourage good actors, and keep bad actors in check? And will governments be heavy-handed and oppressive, or light-handed, resembling peer pressure?

Guest: On the first question, I feel very passionately that as AI thought leaders and innovators and business leaders, we need to be at the forefront of deploying ethical AI. We are part of the Partnership on AI, a consortium started by Amazon, Facebook, Microsoft, and others a number of years ago, which invited in other stakeholders like Amnesty International and the ACLU, and then a number of startups like Affectiva. I'm on the FATE committee: fair, accountable, transparent, and equitable. And they push us toward thoughtful regulation. I'm not against it; I think we should really enact legislation that dictates where the technology can be used, what data can be collected, and who has control of it, all of that.
I'm against a blanket "let's not develop this technology," because I think there's so much potential for it to do good in the world. To answer the question, there are organizations in place, and the idea is to collaborate closely with legislators to ensure this is done in a thoughtful way. The challenge is that not all governments, not all nations, share the same set of core values. For example, in our space, two of our biggest competitors are Chinese companies that have raised billions of dollars of funding, and they have access to a ton of data because it's centralized through the government. And they use it for all sorts of applications, including surveillance and profiling, which we are absolutely not going to do. From day one we have turned that industry away, turned that use case away, including millions of dollars of funding. I believe ultimately our approach will win globally; again, back to peer pressure and consumer advocacy, which is another reason why I wrote the book. I think it's key that the everyday consumer has a voice in how AI is developed and deployed, in the same way we have a voice in fair trade or organic food or going green. I think the same thing has to happen around technology and ethical technology.

Host: Here is a mind-blowing question, a great one; I can't wait to hear your response. "My concern is that a lot of your examples are stopgaps for what is a societal weakness or issue. For example, no one should die alone or be terminally ill alone. These technologies risk depleting our relationships and connection by ignoring the deeper human issue. How do you consider the ethical side of building stopgaps, and what part of the focus of emotional intelligence is greater empathy?"

Guest: That's a wonderful question. I like to be a realist and pragmatic. Is there a version of the future where we do away with
technology and go back to face-to-face communication and a deeper, more empathetic connection? Maybe, but I don't see that as realistic; it's not where we are headed, given everything that is happening today. So you accept that technology is part and parcel of where we are. And you accept that, unfortunately, not everybody has access to mental health care, not everybody has access to health care, not everybody has access to an amazing education. I grew up in the Middle East. I was super lucky: I had an excellent education, and I can connect the dots of why I am able to do what I'm doing today back to my education, but I'm lucky. So can we use technology in a way that democratizes access to health care and education? I think technology can be part of the solution. It does not have to take away from our connection. It should be able to augment it.

Host: Thank you so much, Rana. On a related note, Anna asks the following question: if we start to rely more on AI to connect, how will this affect our ability to empathize with and understand our fellow human beings without the AI tools? Could this get so addictive that, just as I can't find directions anywhere without my phone, it will be like that? If people get too hooked on this, will it actually diminish our ability to do it without the AI?

Guest: That's a concern, yes. But I worry about the opposite situation, where we're all connecting online via text or on social media platforms that do not incorporate nonverbal communication or eye contact. I worry about my son, who is 11 and spends a lot of time on his devices and video gaming, and he doesn't need to practice any nonverbal communication when he is doing these things, and if you don't use a muscle, it atrophies. So unless we redesign these experiences to incorporate nonverbal communication, I worry that we lose it altogether and we don't get to practice empathy in a virtual environment.
So I think there's an opportunity to be creative and reimagine what social and digital communication could look like, in a way that allows me, every time I'm interfacing with a device, be it just the device itself or as a proxy to a human or group of humans, to practice making eye contact, which we don't actually do in this situation, but if I can be making eye contact, if I can be motivated to be expressive because that's going to make the connection so much more authentic and real, then we'll use those skills and develop them. If we don't, they'll go away.

Host: That's a fascinating point. It's not that we should be worried about future technology making us worse; it's more about the current technology's limitations, because it's not getting across our full humanity. It's better that they can see you today, for example, than just hear audio, and if it could also tell us more things about you, and you could tell things about yourself beyond even what we can see, that would be really helpful, too. Exciting to keep that in mind. Here's a question about data, which, as you said, is the sensing part of the problem. George asks: is there any company or organization currently collecting data that can be trusted to use that data for our benefit, or my benefit, rather than their own?

Guest: I think there are some examples of actors or companies like that, yes, but there's just this imbalance. I'm sure if we asked the same question of Jeff Bezos or Mark Zuckerberg, they'd say, "We're collecting data for your own good." It's an interesting question. There's probably a spectrum of trust.

Host: All right, absolutely. Rana, one question I also wanted to ask you: if you were to draw some red lines, things technology should never be able to do or be allowed to do, what would those red lines be? One comes to mind; I'll give an example, and you tell me what else could or should be a red line.
Given that people, just based on voicemails, text messages, or emails, hand over their bank account details to bad actors because what they hear is so convincing, if this technology gets really good and it's indistinguishable from a human, and not just a human but the most manipulative human ever, one that can really get under your skin, I would think that's totally a red line: something that is so convincingly human and that is used for those purposes. That's one example. I want to know what you think: is there a red line that should never be crossed, no matter how capable we get?

Guest For me, I do not think it's acceptable to use this technology without people's consent or opt-in. For us, that meant that security, surveillance, and lie detection are industries we do not play in. At one point we got approached by an agency that wanted to give us 40 million dollars of funding, and that's a lot of money, and we were two months away from running out of cash, so it was hard to turn away from that funding. But I thought about it, and it was not in line with our core values. We want to be your trusted partner for the data, and that felt like a violation of trust: if people didn't know you were collecting that data, didn't know how you were using it, there was a lot of potential for profiling and abuse. So we just stay away from that use case, and I think that is an example of a use case that breaches the spirit of why we would want to do this, which is to bridge a communication gap, not abuse it.

Host Would you make it a requirement that anytime you're interfacing not with a human but with this A.I., you should know that's what is happening? Should it be indicated straight up, like if it's a phone call or any kind of technology, that this is not a human?

Guest We have had these... I don't know if anybody is actually researching this, but I would be so curious; if you know somebody looking into this, let me know. What if you are chatting with an A.I. counselor, and you don't know whether the counselor is human or machine or a hybrid? How does that affect how much you share, the efficacy of it, which is more effective? There's a lot of work to be done there. And then, should the A.I. disclose it's an A.I. or not?

Host A good question. If you could share with us practical tips for our day-to-day lives, at work, and then as a country or just humanity at large: what are super practical, actionable next steps? Start with the personal level. What are the ways in which we may already be able to leverage these technologies in our personal lives, things you have seen that are consumer-oriented and helpful already? Then the next level, for businesses and organizations: what can we leverage? And then at a governmental or international level, what more could be, or already is being, used and implemented?

Guest Okay. On the personal level: applications that track your mood. There are a couple out there, but done right, that can be powerful. It would track your mood and then maybe sync up with your calendar and highlight days of the week or meetings that always leave you in the worst state. It could be like a Fitbit for your mental health. There are examples of that based on physiology: an app that tracks your heart rate can use the camera sensor on the phone and tie it to a level of stress or happiness, but there's more work to be done there. That's one example on the personal level. I would also put in a plug for an app I love called Day One. It's a journaling app, and it isn't exactly emotional A.I. in terms of what I'm developing, but because it allows for all sorts of different types of data and tagging, I use it as a way of expressing my emotions. At an organizational level, we work with a third of the Fortune 500 companies worldwide, and they use our technology to quantify how people respond to their content.
So if you're creating a short video reel, whatever the message or the cause is, you often don't really know how people emotionally engage with that content. So that's one way. Then there's what is coming around the corner, which applies to society at large: automotive and safety. We're collaborating with some governments that are deploying robo-taxis or buses, and they want to make sure that it's a safe environment, that there's no violence, and that people are having a comfortable experience. So that's all part of it. And then telehealth; I want to end with that. We can leverage technology to flag people who are struggling, to flag depression early on, to flag Parkinson's, autism. There's a lot of potential there.

Host Thank you, Rana, that was great. As a last question to bring us home, I want to ask you to tell us about your book, Girl Decoded: A Scientist's Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology. What led you to write this book specifically? What are the key themes it focuses on, for everyone who might be interested? This isn't just a technical manual; this is very much a memoir of your story, so I'd love to hear about that. And then I'd love to ask about your other causes and endeavors: tell us more about your company and the causes you care about and how to best support you.

Guest Awesome. So Girl Decoded came out a few days ago; check it out. I also narrated the audiobook, so if you're into listening to books, check the audiobook out. This has been in the making for years. When I first started, the idea was to write a book about emotional A.I., like the conversation we're having: the technology, the applications, the ethical implications. And I met with the publisher and editor at Penguin Random House for lunch, and he said, one caveat:
I'm not taking on any new A.I. books; there are so many out there. I was like, oh, god. He said, tell me your story. I grew up in the Middle East in a pretty conservative family, but they supported my education; I came to England for my Ph.D. and then landed in Boston at MIT, started the company, and I'm a tech CEO. So he said, that's the story. So we pivoted into this memoir, with the mission of evangelizing what emotional A.I. is and using my unusual path, and how I had to overcome a lot of cultural and societal norms, both growing up in the Middle East and being a technologist in a very male-dominated society, in the hope that this story will inspire other people to forge their own path and find their voice. So, yes, that's the book. I learned a lot. It's been a journey of decoding technology and decoding myself. I'm a work in progress.

One thing I'm passionate about is paying it forward. We have an amazing Internship Program, which we are having to rethink because of COVID, but we usually bring in high school students, undergrads, and postgrads over the summer to engage these young people and, again, think through what we want this technology to look like. I'm also passionate about diversity, be it gender, ethnicity, or age, and I'm part of an organization called All Raise that supports both female founders and female funders. So if you're interested in any of these, please let me know.

Host We have been listing the links, and I'll include them in the thank-you message that goes out to everybody after this event. I want to share my key takeaway and then ask you to share maybe a final word of wisdom with us: if you left us with just one thing to keep in mind as we go about our day-to-day lives, as best we can during this pandemic, anything from your learning you want us to keep in mind, that would be great. So before that, I'll share my key takeaway. I love the title of your book, Girl Decoded.
I love what you said about it being a journey of self-reflection. I think that grappling with this theme of Emotional Intelligence and A.I. is not so much about the technology itself; what it really causes us to do is dig deep and figure out what it means to be human, how we share that, and how we maintain what is so special about it and double down on it, not necessarily replace it. I think this is why A.I. fascinates people so much: not because they care about how the circuits work, but because they really care about what it means to be human, and what that means if a machine can do it too. That's my key takeaway, and I feel like this technology can make us feel more human, more often than not. And with that, I want to hand it over to you.

Guest To build on what you said, my mantra is: let's humanize technology before it dehumanizes us. That's at the core of what I do. My last word of wisdom here is just to lead with empathy. Whether you're leading your teams or organizations or your families or communities, begin with empathy.

Host I love that. What a great call to action. You were wonderful today. So thank you so much for joining us.

Now on C-SPAN2's Book TV, more television for serious readers.

Welcome, everybody. I'm Steve Coll, the dean of the Columbia Journalism School, and I'm pleased to welcome you to tonight's conversation with the 2020 J. Anthony Lukas Prize winners. I'm sorry we're conducting this conversation virtually rather than in the World Room at Columbia, and I hope we're back together again next year in the
