
It's actually stopping when it should stop, it's taking the curve fairly comfortably, and it seems to recognize [inaudible]. They build a degree of comfort. We learn more about the move to self-driving cars from Professor Rajkumar at the lab where his experimental car is kept and worked on.

Host: So, Dr. Rajkumar, what's your job here at Carnegie Mellon University?

Guest: I am a professor of robotics and engineering.

Host: How'd you get into that?

Guest: It turns out that I did my postgraduate studies, my master's and Ph.D., at Carnegie Mellon. Then I joined IBM Research for three years or so, and I came back because I liked CMU and Pittsburgh so much.

Host: What kind of things do you work on?

Guest: What are called embedded systems; that's the technical term. These are things that embed computers inside them but are basically different devices. For example, your smartphone is basically a kind of sophisticated device with computers inside it, but you really don't [inaudible]. Take a TV, for example. It has a computer inside of it, but it is meant to be a television; you think of it as basically that device. A toaster that you use on a daily basis and that costs $10 has a teeny-weeny computer inside it. So basically these are embedded systems: they embed computers and act smart, but each is really a dedicated device of some other kind for the end user. So I've been working on embedded systems since my doctoral days, if you will. And it turns out that intelligent vehicles that drive themselves are a classical example of embedded systems: a car that transports people and goods from point A to point B, which just happens to have computers embedded inside it.

Host: How did you get into the business of autonomous vehicles?

Guest: Great question. I have been working with General Motors, the carmaker, since 2004.
General Motors has been working with researchers in our department since the year 2000, so when I started working with GM in 2004, I started applying my expertise in embedded systems to automobiles. And then in 2006, DARPA, the Defense Advanced Research Projects Agency, the research arm of the military, announced a competition. The name of the competition was the DARPA Urban Challenge. The intent was vehicles that drive themselves without anybody in the car; they needed to drive for about 60 miles in fewer than six hours in an urban-like setting, alongside other self-driving vehicles as well as human-driven vehicles, following the same rules of the road that you and I have to follow on a daily basis. For that competition, GM became our biggest partner and sponsor. We had about 23 other sponsors as well, but GM was the biggest of them. Because we already had a very strong working relationship with GM, I became an integral part of the team from CMU that worked on our vehicle, which ended up winning the competition as well, a $2 million prize. So that's how I got into automated vehicles. And when our team from Carnegie Mellon won the competition, GM, who was the biggest sponsor, said, hey, our team actually sponsored the winning team. And because it's about urban driving, it clearly has implications for the passenger consumer market segment, so they started a second lab on campus focusing exclusively on automated driving, and I've been running that lab as well since its launch. So that's how I got to do automated vehicles and started working with GM. Our relationship continues to be very strong and loyal.

Host: So does GM own the technology that you develop?

Guest: The technology that they sponsor -- and we are very grateful for their support -- is actually owned by Carnegie Mellon University. We have some licensing arrangements with GM, but they do have access to parts of the technology.
And we learned more about the experimental Cadillac as we got ready to take it for a drive.

Host: Okay. Now, I want to start by looking at -- what is this monitor up here on the top?

Guest: So that's one of six laser sensors. That's the one on the forehead of the car. There's one in the bumper here, right there. There's a third lidar on the other bumper as well. There is one behind the side back window of the car, and there's one on the other side, exactly opposite to that.

Host: What are they reading?

Guest: What they're doing is sending out multiple laser beams, and when the laser beams hit an object, they bounce back to the origin, the transmitter. And because we know the speed of light, you can actually calculate how far that object is. And because there are multiple beams that create a scan as well, you can actually get a profile of the obstacle that you just encountered. And because we have lidars all over the car, the vehicle actually knows what's happening around it in real time.

Host: Now, Dr. Rajkumar, is this car communicating with anybody but itself -- anything but itself?

Guest: The vehicle is capable of communicating with properly equipped traffic lights, traffic signs and other similar radios. We like to call it a connected and automated vehicle, or CAV for short.

Host: We see some cameras inside the car. What are these?

Guest: Yes. In addition to the six laser sensors that we talked about, there are three cameras and six radars as well. There are two cameras, one on either side of the rearview mirror inside the cabin. One is actually pointed downwards and is looking for lane markers on the roads. The other camera is pointing upwards and looks at traffic lights, so you know the status of the traffic lights as you're going along. There's a third camera at the back of the vehicle for backing purposes, so you can see what's going on.
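The distance calculation the professor describes -- timing a laser pulse's round trip at the speed of light -- can be sketched in a few lines. This is an illustrative example, not the experimental vehicle's actual software; the pulse timing value is made up.

```python
# Lidar ranging by time of flight: a minimal illustrative sketch.
# The sensor measures how long a laser pulse takes to bounce back;
# the distance is half the round-trip time multiplied by c.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_range(round_trip_seconds):
    """Distance in meters to the object that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~500 nanoseconds means an obstacle roughly
# 75 meters away, about the sensing range mentioned in the interview.
print(round(lidar_range(500e-9), 1))  # 74.9
```

Scanning many such beams per rotation is what gives the obstacle "profile" the professor mentions: each return becomes one of the dots later seen on the display.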
Guest: And there are also six radars. There's actually one behind this Cadillac emblem. We replaced the metal logo with a plastic logo so the radar can see through the plastic. There's a radar behind this bumper as well, on top of the lidar that's actually behind the bumper; the bumper is made of plastic, so the radar can see through it as well. There's a radar on the other side of the bumper, and there are two on the sides right next to the lidars, but you cannot see them from the outside, and you don't see them from the inside either. There's another radar at the back, behind the bumper, also.

Host: Is this car seeing 360 degrees?

Guest: It is seeing 360 degrees all the time, as opposed to humans, whose heads can't turn all the way around.

Host: So, Professor Rajkumar, when you get in this car, what's different? What's different in the look from a regular Cadillac?

Guest: We tried to make this car look normal on the outside and the inside. On the inside it pretty much operates and looks like a normal car, just like when you get a rental car at the airport and pick up the keys: the layout looks slightly different, but you're still able to drive. Basically the same thing here; you can bring your keys, get the vehicle manually and start driving. Meanwhile, if you actually look at the dashboard, there are only two things that have changed. There is a button on the dashboard; it's an emergency stop. And there's a button behind the stick shift. Think of this as the autonomy equivalent of a cruise control engage button. You would engage this button to go into autonomous driving mode. What you need to do is rotate the switch and then pull it upwards to get into autonomous driving mode, and that is a very conscious action on the driver's part.

Host: Is this car licensed to drive the streets of Pittsburgh?

Guest: The laws in Pennsylvania allow a vehicle to drive itself as long as two [inaudible] are satisfied.
Number one, a licensed human driver is in the driver's seat, and number two, that human can take over control at any point in time. So under those two conditions, vehicles can drive themselves on public roads.

Host: But this could drive as a normal car as well, correct?

Guest: Yes, of course. To disengage the autonomy, one has to reflexively push this thing down. But if you did not actually have the time to do that, you could grab hold of the steering wheel and turn it, or you could press the brake pedal or the gas pedal, and the vehicle will still respond. It waits for the human to take over. And this button here is strictly for emergency purposes, because we added a bunch of sensors, computers [inaudible]. If, for example, something totally unexpected happens, like you start smelling smoke and you have no idea what's going on while the vehicle is driving, you just push that button in, and it mechanically, electronically [inaudible] becomes a [inaudible]. Knock on wood, we have not had to engage that while driving yet.

Host: What's the cost of all the different systems that you've added to this car?

Guest: Many of the sensors, radars and cameras are really one-off units, so they tend to be expensive. So it is an expensive vehicle because of that. But the real thing to be thinking about is that when the volumes go up, the costs will come down significantly. Our thinking is that when these vehicles are mass produced, it would add about $5,000 on top of the price, and we think that would be very affordable to many people.

Host: All right. Well, you're going to give us a driverless ride. Is it truly driverless? Is that a fair word to use?

Guest: It is an automated vehicle. It can drive itself under many conditions, but not under all conditions -- at least not yet.

Host: Okay.

Guest: Let's start driving. I'll take you around and [inaudible] explain a few more things.

Host: And you're driving manually.

Guest: It says manual on the screen.
Host: Ah.

Guest: Let's point out a few things on the screen. We interfaced our computers to the same screen, so I can flip back and forth: this is the normal screen of the stock vehicle, and this is the screen that we added. By flipping a switch, we can go back and forth.

Host: And what are these images we're seeing here?

Guest: What you see on the screen is the display that shows what the vehicle is doing at any point in time, so that we as humans can feel comfortable that the vehicle is indeed doing the right thing, and a human can take over if something is not quite right. What we see here are some icons, and the icons, for example, say that I can launch, I can stop, I can automatically tell the vehicle to go to the airport or go to work. And this basically lets you zoom in or zoom out. And you see two blue lines; they represent the lane that the vehicle can drive in.

Host: Now, there are no lines on this parking lot that we're in.

Guest: We have programmed the map of this parking lot, so the vehicle is basically allowed to drive along these lanes, and when we go to the main public roads, basically that [inaudible] the map that the GPS navigation system has. And then you see a green line there. We have already preprogrammed it to take a right out of this parking lot next to where we are, so that's the route that your GPS device calculates.

Host: Okay.

Guest: The GPS has a built-in map database, and [inaudible], so that route is the green line that you see. You also see a very short line out there. That is the car, knowing the green-line route and knowing the blue-line map, using its sensor data from the lidars, radars and cameras and saying: over the next 15, 20 meters, this is where I'm going to drive.

Host: Did you have to program every route ahead of time, or can it go out onto a street it's never been on before?
Guest: It needs to have a map of the roads --

Host: A GPS map.

Guest: A GPS map. And then you need to tell the system where you want to go. It uses the map information and your current location to calculate the route, just like a GPS Garmin device or Google Maps. So that's the green line.

Host: Okay.

Guest: And the red line is basically the vehicle --

Host: This little red line right here. Yes, okay.

Guest: And then, if I zoom back a little bit, you see a bunch of dots on the screen --

Host: Yeah.

Guest: Those dots are the laser points from the laser sensors, appearing in real time. So what you see here is basically a bunch of yellow dots --

Host: Is this that white car?

Guest: That's the white car. And these white [inaudible] and dark navy blue --

Host: The dumpster?

Guest: That dumpster there.

Host: And all these white dots, these are the trees. Okay.

Guest: Basically, it is able to sense the environment. We use our eyes and ears; in this case, the lasers, the cameras and the radars act as the eyes and ears of the vehicle.

Host: How far can it see?

Guest: About 75 to 100 meters. But with the connectivity technology that we discussed earlier, it has built-in wireless communication radios that can talk as far as 600 meters.

Host: All right.

Guest: So let's do the following: let's engage the vehicle in autonomous mode. The vehicle is actually in parking mode. I'm going to basically engage the autonomous mode --

Host: While it's in park.

Guest: While it's in park.

[Vehicle: Autonomous driving.]

Guest: It started driving.

Host: Does this ever make you nervous?

Guest: I guess the normal reaction for anybody new at this is, hey, you feel a lot of anxiety. [Laughter] What happens if Raj is not paying attention anymore? Let's see how the vehicle does.

Host: Okay. Dallas, you doing okay back there? [Off camera] Think so.

Host: And it turns on its turn signals because you told it where you wanted to go already, correct?

Guest: Yes.
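The route computation the professor describes -- map plus current location in, green-line route out -- is classically a shortest-path search over a road graph. A minimal sketch using Dijkstra's algorithm, with entirely made-up road names and mileages; the vehicle's real navigation stack is of course far more elaborate:

```python
# Shortest route over a small road graph (Dijkstra's algorithm).
# Road names and distances are hypothetical, for illustration only.
import heapq

def shortest_route(roads, start, goal):
    """roads: {node: [(neighbor, miles), ...]} -> (total_miles, [nodes])."""
    pq = [(0.0, start, [start])]  # (distance so far, node, path taken)
    seen = set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, miles in roads.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (dist + miles, nxt, path + [nxt]))
    return float("inf"), []

roads = {
    "lot":     [("forbes", 0.2)],
    "forbes":  [("fifth", 1.0), ("bigelow", 1.5)],
    "bigelow": [("airport", 15.0)],
    "fifth":   [("airport", 17.0)],
}
print(shortest_route(roads, "lot", "airport"))
```

The green line on the display corresponds to this global route; the short red line is a separate, local plan recomputed from live sensor data over the next 15 to 20 meters.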
Host: Okay, now, all right. It saw -- so you did that, you hit the brakes.

Guest: I did not.

Host: Oh, it hit the brakes. And it knows the speed limit.

Guest: Yes. It sticks exactly to 25. It will do the legal speed limit, which is 25; pretty much everybody drives 35 or 40, but the vehicle is a stickler for the rules. [Laughter] So right now we have a vehicle behind us, and of course the driver is saying, why are you driving so slow?

Host: Uh-huh. It seems to do a little meandering in the lane. Is that a fair assessment?

Guest: It could be better, yes.

Host: But it's just reading constantly.

Guest: Yes.

Host: Now, it's going to make some --

Guest: This is a curvy, winding road. So I'm not controlling the steering wheel or the brake pedal or the gas pedal, and it was able to shift the transmission by itself.

Host: I see that. Now, that [inaudible] -- okay. It sensed that? That biker?

Guest: Yes, sir.

Host: Okay. All right. How far have you come in 30 years?

Guest: We have come a very long way, but there are still some ways to go to basically, completely remove the human from the driving process.

Host: Is this vehicle constantly learning?

Guest: This vehicle is not constantly learning by itself. We collect data and use the data to teach the software about new features, new functions. So it is not learning as we drive; it's actually learning after the fact, when we go back.

Host: How did it know there was a stop sign there?

Guest: The map has indications about where the stop line is. It does it all by itself.

Host: That was the car that did that.

Guest: The car did it all by itself, yes.

Host: It wasn't sure of its speed --

Guest: Basically, it moved slightly to the right when it saw those parked cars.

Host: Okay.

Guest: So let me -- I do need to get back, so I'll take over manually. I just push this down, in which case the vehicle has gone back to manual mode.

Host: Okay.

[Vehicle: Autonomous ready.]

Host: And you do that on the fly?

Guest: Yes.

[Vehicle: Autonomous driving.]
Guest: So we can switch back and forth seamlessly, if you want.

Host: And it's seeing all of these things --

Guest: Yes. It's not going to --

Host: There's a baby.

Guest: This crosswalk is not on the map, so it doesn't understand it. These have just been added recently.

Host: So it's not quite ready to be sent out on a road it's never been on before.

Guest: Highways we have never been on, we can do that. In urban areas we're a lot more careful because of pedestrians and bicyclists. We do it on highways but not on urban corridors.

Host: Okay. Bike. [Inaudible]

Guest: We are back on that curve, the curvy, winding road.

Host: Can it read signs?

Guest: It can read some signs, yes, but not all signs. It turns out there are thousands of signs, and it does not understand all of them. So there you see that red line there. The green is the path that it wants to take, and it turns out that the red line is exactly on top of that.

Host: Okay. How far have you driven in this car autonomously?

Guest: We have driven a total of about 20,000 miles or so autonomously. Not in one continuous stretch.

Host: What's the longest trip you have ever taken?

Guest: We have done a couple of hundred miles on highways. And a spinoff that I created has been used by Delphi to drive from San Francisco to New York City, about a 3,500-mile journey, and the vehicle drove itself on highways about 96.8% of the time. So highway is not a problem.

[Vehicle: Autonomous ready.]

Guest: So you have taken your first ride in an autonomous car.

Host: When will we do this regularly as consumers?

Guest: Simple question, basic question; I'll give you a long and complex answer. You can already buy vehicles, for example, a Tesla with an Autopilot feature, where the vehicle can drive itself but the human must be paying attention.
General Motors, next year, will include a similar feature that they call Super Cruise, where the vehicle can steer itself and apply the brakes and gas pedal as well, and that will be in a Cadillac next year. And many high-end vehicles can already park themselves today. So some of these features are already available on the market, if you will. And then three to five years from now, we expect that vehicles will be able to drive themselves, but in well-specified, well-defined, geographically constrained regions. That's called geofencing: geographically fenced areas where, for example, pedestrians aren't allowed, bicyclists are not allowed, and there is no heavy rain or heavy snow -- California, for example. So those areas can deploy some of the technologies earlier. But when you ask when the human will not have to drive at all, that implies the technology should be able to drive a vehicle itself from any point A to any point B that you and I as experienced drivers can drive in the U.S. That capability is going to take at least ten years. We have come a very long way over the past couple of decades or so, but there is still quite some ways to go before the human can take himself or herself out of the seat, go into the backseat and take a nap.

Host: Have you allowed your kids or wife to ride with you in the autonomous car?

Guest: Sure. We have had many family members go along. We have had many guests like yourself in the car.

Host: I was a little surprised we didn't have to sign a release before we got in.

Guest: [Inaudible] lax, if you will. That's because we're just researchers.

Host: Why are we talking to you about autonomous cars in Pittsburgh rather than Detroit or Silicon Valley?

Guest: Great question. It turns out that Carnegie Mellon is globally well known and has a strong reputation for computer science and engineering as well as robotics.
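The geofencing idea described here reduces to a point-in-region test: the automated service operates only while the vehicle's position lies inside a designated polygon. A minimal ray-casting sketch with hypothetical coordinates, purely to illustrate the concept (real systems use geographic coordinates and far more careful boundary handling):

```python
# Geofence check via ray casting: count how many polygon edges a
# horizontal ray from the point crosses; an odd count means inside.
# The fence coordinates below are made up for illustration.

def inside_geofence(point, polygon):
    """True if point (x, y) lies inside the polygon (list of vertices)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's height
            # x-coordinate where this edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

fence = [(0, 0), (4, 0), (4, 3), (0, 3)]  # a simple rectangular zone
print(inside_geofence((2, 1), fence))  # True: inside the zone
print(inside_geofence((5, 1), fence))  # False: outside the zone
```

A deployment would run a test like this continuously and hand control back to the human (or stop) when the vehicle approaches the fence boundary.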
We have an entity called the Robotics Institute on campus, which is internationally recognized and has more than a hundred researchers in it, excluding students, if you will. They're all extremely knowledgeable about the area of robotics, and [inaudible] in the field have been built at CMU since the early 1980s. In fact, we at Carnegie Mellon believe that we are the birthplace of autonomous vehicle technology, dating back to about 1983 or so. A couple of years back, in 2014, we literally celebrated the 30th birthday of this technology on campus.

Host: You said before we started this interview that computers are simultaneously very intelligent and very stupid.

Guest: Yes. Computers are simultaneously very intelligent: they can do things that amaze us, right? They can react very quickly, and they can make decisions that to a normal person seem extremely smart. How does it know that it should be driving at this speed, slow down there, and so on? They look very intelligent, and they are very intelligent, because they process this 360-degree view around the vehicle with multiple sensory data streams from lasers, radars and cameras. Very intelligent. But at the same time, they are stupid, if you will, because they don't really have common sense. For example, we know that when we fall down or when we touch fire, it hurts, and the next time we won't do it. But computers cannot make that simple inference: hey, I crashed into somebody, next time don't do that. It'll do the exact same thing unless it is programmed to do something else specifically by a human being. This is the 11th generation. It is being run by [inaudible]. The vehicle on the right is the Cadillac that we were able to drive today. That vehicle was created by the project that I lead, with support from General Motors and through the Department of Transportation.
Because of our close working relationship with GM, we are extremely sensitive to the aesthetics of the vehicle, the exterior and the interior as well. It looks very normal; that's something GM would be proud to sell.

Host: How far along are we with this technology?

Guest: The technology has
