So hello, everybody. Welcome to the stage for the first talk on day three. We are really happy to have Alastair and Auris here. They're from the Cocoa Group, which is an interdisciplinary group that researches things and tries, on the one side, to let everyone work on their own, but also to get together, discuss and debunk ideas, and help each other with their projects. The Cocoa Group was founded in 2013 by Auris, if I remember correctly. They're researching a wide range of interesting and relevant topics, and they are, up to now, unfunded and completely autonomous. Which brings us to the subject of this talk, because these two are going to talk about self-driving cars. I'm really interested in what you have to tell us, so stay tuned.

Thank you. I just needed to unmute myself. Thank you so much for this kind introduction. And before we get anything wrong out there: there was a bit of a misunderstanding. Alastair is not part of the Cocoa Group; maybe I can convince him to be part of it one time. But he's taking part here, and that's fine. So we're going to talk about, in this case, the hundred-billion-dollar delusion of self-driving cars. We are talking about it as a delusion, and we are hoping to create an interesting talk touching all kinds of topics.
But we're going to end up with the question of whether the car in itself should be abolished and, of course, what interesting mobility concepts would follow if we try to design an encompassing, futuristic mobility that's good for everybody. Thank you.

Let's start. I'm sure you all know him; here he is, looking extremely serious. He is a very serious guy, as you know. This is him in 2019: "I think we'll be feature complete in self-driving this year, meaning the car will be able to find you in a parking lot, pick you up and take you all the way to your destination without any intervention." And just to make sure you know how serious he is, he adds: "I am certain of that. That is not a question mark." So let's see how they're getting on. They actually released it a year later, a bit later than they planned, and it's now in beta. Here we go.

[Clip from one of the beta test videos:] "You can let it go... oh, oh, oh, jeez. Oh my God. Yeah. That's a good example, a good example of this. Does he still have control? It just steered directly into the back of this parked car, and it wasn't going to brake. It's still detecting those things on the road as... jeez. Yeah. Oh my gosh."

Well, this is why we don't have people with us normally. OK, so there might be a few problems there, and this is really what our talk is about. Over the last few years there have been hundreds of billions of dollars spent in this field.
We got that figure from a survey in 2017, which said 80 billion, and we know that quite a few billion more than that have been spent in the last few years, when it has really peaked. That includes startups, the big tech companies, the automotive manufacturers, plus a whole network of other suppliers and consultants. For the sake of convenience we call this the technology-mobility complex. And they're all convinced, and trying to convince us, that self-driving cars are just around the corner.

So how are they getting along? Well, we've seen Tesla, and certainly they have something on the market. It is in beta, at 7,000 dollars or somewhere around that, and there have been three fatal crashes so far. It's very limited compared to what they said it would be.

You've got Audi, for example, who have also tried to bring something to market. This is the A8, which has Traffic Jam Pilot. They convinced regulators in Germany that it was perfectly safe to read your emails whilst driving with Traffic Jam Pilot on. Unfortunately, they didn't quite convince any regulator in any other country, and as a result it has been withdrawn from the market this year, basically because if there was an accident, it would effectively be Audi's fault.

Then you have, of course, the big tech companies. Uber has been very prominent in this field, looking at mobility as a service; that's a concept we'll come back to. They started in 2015 and have invested over a billion dollars. But unfortunately Uber has had some issues too, and we'll talk more about that later on. There was a fatal crash in 2018, and in November this year, just a few weeks ago, they announced they were giving up altogether.
They sold off, and we should really say paid to give away, their self-driving project to another company.

Amazon got into the field with the startup Zoox very recently, a little late to the game. But Zoox has gone for a completely autonomous, new-build design, which was unveiled a few weeks ago. However, they were a little bit coy about when this will actually be working on the streets; all they would say is that it's definitely not going to be 2021. Our view is that it's probably going to be a fair bit later than that.

And then you have the old automakers also involved. Most prominent among those is General Motors, who bought up Cruise. They've been investing in it since 2013, and they claimed they would have a commercial robotaxi service available by the end of 2019. Well, that hasn't quite materialized. If you go to San Francisco you certainly see the Cruise cars all around. They recently announced, and they were very, very happy about this, that they have driverless vehicles for the first time on five streets in San Francisco. They will only be operating in low traffic and at night, and there will be one person in the car with an emergency stop button. So it's still a little far away from a commercial service.

And then, of course, you have the big player, which is Waymo, owned by Google. They've been around since 2009, with a number of different iterations. You can see the first version they had here, the Firefly; but someone must have been very rude about that to somebody at Google, because these are the latest versions, which are obviously really, really mean Jaguars.
Anyway, they've been going in Phoenix for a long time, since 2017. They announced in 2018 that they'd have a fully driverless taxi service. Then they announced it again in 2019; it was taking a little bit longer. They finally announced it again in 2020, and they actually kind of do have driverless taxis in Phoenix, but it's limited to a 50-square-mile area. The cars are actually supervised remotely, so somebody can intervene from afar, a safety driver effectively dialing in remotely. And they also need perfect weather. So again, very limited, and as always, where you see the successes in this area, it's in these highly controlled environments. Phoenix is an ideal place, with perfect weather, easy streets, no hills.

And then you've got other projects like this one, the Ford Argo. Again, a lot of investment: two billion from Ford and further investment from Volkswagen, and they're delivering fruit and veg in Miami, which is very nice, again in a very small area, and that requires two safety drivers. And then you get this, which is a very successful project, the Optimus Ride project in a retirement community. Those are the kinds of places where self-driving cars seem to be working; not so much on busy urban streets.

And you could go further than that. This is a quote from Dr. Gill Pratt, the head of Toyota Research. He says: "I would challenge anyone in the automated driving field to give a rational basis for when Level 5 will be available." We'll talk a bit about Level 5, but it means a fully autonomous vehicle, which is what we would all expect a self-driving car to be.
And here's John Krafcik, the CEO of Waymo. In a candid moment a couple of years ago, he conceded: you know what, self-driving car technology is actually really, really hard. Who could possibly have thought that? And I'm going to hand over to Auris to explain a little bit more about why that is.

So first, if I may, we're going to talk about why we call it a self-driving car. To answer that, we need to look at language and technology. In general it is interesting to note that words gain meaning through the way they are used. They can, if you want to put it that way, lose meaning, but they can also truly change meaning and become ambiguous through the wide acceptance of more than one meaning in society. And "autonomous" has, by now, not only one meaning anymore, to say the least. So if you look at a definition of an autonomous car (hang on, hang on, stay with me), if we look at the definition of an autonomous car, you get the obligatory definition of a vehicle capable of sensing its environment and operating without human involvement: a human passenger is not required to take control of the vehicle at any time, nor is a human passenger required to be present in the vehicle at all; an autonomous car can go anywhere a traditional car goes and do everything that an experienced human driver does. Now that is a high aim, wouldn't you say? So we land in impractical ambiguities with this, and not only because the traditional definition of autonomy is something completely different.
I'd like to share with you just a very short version of what we read when we type it into the Stanford Encyclopedia of Philosophy, which is a nice source. It says: individual autonomy is an idea that is generally understood to refer to the capacity to be one's own person, to live one's life according to reasons and motives that are taken as one's own and not as the product of manipulative or distorting external forces. Now, this is not a new point; this point about autonomy has been made quite a number of times, especially in connection with autonomous cars, but it is important to mention nonetheless. And we will see how difficult and how ambiguous it gets when we go to the automation levels that are ultimately defined via automation, sorry, the automation levels that are ultimately defined via autonomy, while autonomy is supposed to be the more complex concept. That seems a bit the other way around.

So, the Society of Automotive Engineers... oh, by the way, insurers have identified the ambiguity of "autonomous" as a potential reason for an increase in crashes, due to confusion. So it's not only a point of philosophical interest.

So, the Society of Automotive Engineers, the SAE, currently defines six levels, starting at zero, which is why we end at five, obviously. These levels go from fully manual to fully autonomous, and they have been adopted by the U.S. Department of Transportation. Now, there is a standard out there that I'm going to come to, and we'll look at the wording that they use, to be sure what we're talking about.
But in general it's safe to say that Level 1 assumes a system can assist the driver with one driving task, just one. Adaptive cruise control, ACC, fits into this category. Level 2 systems, such as Pilot Assist, can assist with, for example, two tasks, and Level 2 is the highest level of automation currently available. That will lead to a discussion on strategy and development: one camp is trying to erase the driver, and the other is trying to kind of see through the driver and learn from the driver. The first one is trying to skip Level 3, and the second one is putting great emphasis on it.

But to understand where that lands with all the assistance systems and the wording, we need to map it somehow, which is why I looked at the "Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles", which is just the long name for the standard J3016_201806, the standard where it's all defined. And this standard gives you roughly the following explanation. As a philosopher, I need some proper words. And as I said, I am utterly aware of the fact that we can use words differently in different contexts, for different reasons. But we should also make sure that we understand each other and the context in which we use those words, and we should be clear on at least our own extent of meaning when we use them, especially when they're used by others. Sorry, I've got a little scrolling problem here.
OK. So the document that I was talking about refers to three primary actors in driving: the human user, the driving automation system and, interestingly, other vehicle systems and components, these other vehicle systems and components being the vehicle in general terms. Sorry, I'm having serious trouble scrolling here.

So it boils down to processing modules and operating code that overlap in the automation system, and to subsystems that are supposed to be distinguished somehow, primary-actor-wise. I'm going to be through with this in a second, but just so you know: these automation levels are defined by the role of those primary actors and how they act in traffic. So they're trying to map the automation levels onto the dynamic driving task (DDT) performance and the DDT fallback, which is usually the driver, especially in the systems that we're talking about nowadays, but which is eventually supposed to be handled by the system completely; we're going to talk about that a little bit later. And it's necessary to see that this is about the way the system is designed, not necessarily the actual performance of a given primary actor. For example, a driver who fails to monitor the roadway during engagement of a Level 1 adaptive cruise control system still has the role of the driver, even though he or she is neglecting it. That is basically the easiest example you can pick, and all the others bring you into actual trouble: this one seems clear, but the others really don't.
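To keep the terminology straight for the rest of the talk, here is a minimal sketch of how the six SAE J3016 levels allocate the driving roles between driver and system. The wording is a paraphrase of commonly cited summaries of the standard rather than its exact text, and the field names are purely illustrative.

```python
# Illustrative paraphrase of the SAE J3016 levels (not the standard's exact wording).
# "DDT" = dynamic driving task, "OEDR" = object and event detection and response,
# "ODD" = operational design domain.

SAE_LEVELS = {
    0: {"name": "No Driving Automation",
        "sustained_control": "driver", "oedr": "driver",
        "ddt_fallback": "driver", "odd_limited": None},
    1: {"name": "Driver Assistance",           # e.g. adaptive cruise control (one task)
        "sustained_control": "driver + system (lateral OR longitudinal)",
        "oedr": "driver", "ddt_fallback": "driver", "odd_limited": True},
    2: {"name": "Partial Driving Automation",  # e.g. 'Pilot Assist' (two tasks)
        "sustained_control": "system (lateral AND longitudinal)",
        "oedr": "driver", "ddt_fallback": "driver", "odd_limited": True},
    3: {"name": "Conditional Driving Automation",
        "sustained_control": "system", "oedr": "system",
        "ddt_fallback": "fallback-ready user", "odd_limited": True},
    4: {"name": "High Driving Automation",
        "sustained_control": "system", "oedr": "system",
        "ddt_fallback": "system", "odd_limited": True},
    5: {"name": "Full Driving Automation",
        "sustained_control": "system", "oedr": "system",
        "ddt_fallback": "system", "odd_limited": False},
}

for level, spec in SAE_LEVELS.items():
    print(f"Level {level} ({spec['name']}): fallback = {spec['ddt_fallback']}")
```

The point the talk keeps returning to sits in the ddt_fallback column: up to Level 2 the human remains the fallback even while the system is doing the steering and the braking.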
OK, so we're talking about problems in decision-making, in predicting and in responsibilities. These levels apply to the driving automation, and you can see what I've just been talking about here. The slide has shifted a little, but I've tried to repeat the definitions up there. You can see that it matches roughly onto the scale: you have the systems, you have the human driver and you have the other system components, which end up being "some driving modes". But "some driving modes" doesn't give you too much information, obviously.

So while we're trying to get informed about the extent of assistance systems and how responsible they are, we end up being really confused, and that's a bit of a pity. And we can see this; I think we can move on to that, we're already there. Some of those subsystems, even in the definition, are explicitly excluded from the taxonomy that is supposed to describe the automation. So we have automated subsystems explicitly excluded from the automation taxonomy. And understanding what that means is only possible if we look at what the heck we're talking about. So what are these ADAS systems that we have to understand, language-wise? These features basically boil down to perception. Roughly, every autonomous car (and you can always argue with the wording, and you can always argue with the nuances) has a perceptual system, a decision-making system and then actuators that carry out those decisions. What they consist of depends, roughly, on whether you use a Tesla or not, because Musk doesn't believe in lidar.
And for a good reason that I'll come back to later. But the idea is that you have surround view, cross-traffic alert, park assist, emergency braking, traffic sign recognition, lane departure warning, adaptive cruise control, collision avoidance, rear collision warning and all these kinds of things. And they amount to different functions with different extents of automation, right? Lane keeping has more automation than... well, I'm not even going to go there, because that is exactly what is so difficult about it.

What's interesting, and necessary to understand in general, is that perception is made up of computer vision and sensor fusion, and it's all about understanding the environment. Computer vision uses cameras and allows the car to identify cars and pedestrians and roads, and sensor fusion merges data from the other sensors, such as radar or lidar, or infrared when something is close to the car, to complement the data from the cameras; it depends on the project we're talking about. Decision-making sits on the prediction and decision side, and it is not (as perception, even though impressive, does not seem to be yet either) developed sufficiently to just roll it out as they're claiming. That's the point I'm trying to get at here, and while showing that, it is interesting to look at those things. All of this we can come back to in the discussion; I've got to move on.
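To make the sensor-fusion point above a bit more concrete, here is a minimal sketch of one textbook approach: fusing two independent, noisy range measurements of the same object (say, one derived from a camera and one from a radar) by inverse-variance weighting, which is the static one-dimensional special case of a Kalman update. The numbers are invented for illustration and do not describe any particular vendor's stack.

```python
# Minimal illustration of sensor fusion: combine two independent, noisy range
# estimates of the same object by inverse-variance weighting.

def fuse(range_a_m, var_a, range_b_m, var_b):
    """Return the fused range estimate and its variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * range_a_m + w_b * range_b_m) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical numbers: a camera-based depth estimate is noisy (sigma ~ 2 m),
# a radar return is much tighter (sigma ~ 0.3 m).
camera_range, camera_var = 41.0, 2.0 ** 2
radar_range, radar_var = 38.7, 0.3 ** 2

fused_range, fused_var = fuse(camera_range, camera_var, radar_range, radar_var)
print(f"fused range: {fused_range:.2f} m (sigma {fused_var ** 0.5:.2f} m)")
```

The genuinely hard part the talk alludes to is everything around this step: deciding that the camera detection and the radar return are the same object in the first place, classifying it correctly, and keeping that association stable over time.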
Now we're going to talk about advanced driver assistance systems, ADAS, and their interaction with the driver. There are two slides on this because, and this goes back to what I mentioned in terms of strategy, the first bulk of ADAS systems is supposed to take away the wheel, to take away the need for you to take the wheel; and the second bulk, or what is now being recommended, is basically making you transparent in your decisions, so that the system can learn from you. Because what drivers can do is still much better than what cars can do, and since it didn't really work out to skip Level 3, they're now trying to come back to it.

So the downside of ADAS systems doesn't seem so problematic, you know, it's all just confusing; but interestingly that actually has practical consequences: people don't know what the systems are doing, and thus they are causing accidents. It's also misleading in terms of wording; I had noted that as a point because some people think they can take a nap while driving, which I found very interesting. And now, if we look at the next slide, you can see that there are recommended escalating attention reminders for Level 2 automation. And Level 2 automation, again: we started from zero, we want to get to five, we have Level 2 right now, and it already ends with the car taking over from you and locking you out. So what I'm trying to say is that first they tried to erase the driver, and now they're doing everything to make the driver back up the system that they developed.
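As a rough sketch of what such escalating attention reminders amount to in practice, here is a toy hands-off timer that steps through warnings and finally locks the driver out. The thresholds and actions are invented for illustration; real Level 2 systems differ in the details.

```python
# Toy escalation logic for a Level 2 attention-reminder scheme.
# The thresholds (in seconds) and the final action are hypothetical.

ESCALATION_STEPS = [
    (5.0,  "visual reminder: 'keep your hands on the wheel'"),
    (10.0, "audible warning chime"),
    (15.0, "loud alarm, tighten seatbelt"),
    (20.0, "disengage assistance, brake gently to a stop, lock out re-engagement"),
]

def reminder_for(hands_off_seconds):
    """Return the strongest escalation step reached after this many seconds hands-off."""
    action = "no warning"
    for threshold, step in ESCALATION_STEPS:
        if hands_off_seconds >= threshold:
            action = step
    return action

for t in (3, 8, 14, 25):
    print(f"{t:>2} s hands-off -> {reminder_for(t)}")
```

Which is exactly the speaker's point: by the final step it is the human who is being monitored and managed so that the automation can keep operating.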
And to do that, the consequences, the costs that come with it, might just not be that desirable, depending on what you're looking out for or have a problem with. Of course I'm hinting at privacy; more on that later. But even if we leave out that very massive, unbelievably massive topic (oh, and I didn't even go into observation and control in terms of the ambiguity), even if we don't think about that privacy issue, this turn in the development shows that the hype is leading to developments, massively and greatly pushed developments, that might just not be that well aimed. If galloping off in one direction first ends in galloping in the opposite direction, still with a lot of enthusiasm, that might just be a thing to notice. And it's not a grumpy point against progress; you know, we want things to get better, and in a good way.

OK, where are we? Failures in perception, which I've been mentioning, and I don't think I have the time to really explain what it boils down to. But apart from the driver not paying attention, one of the reasons why this is problematic is the driver's confusion, but also that it's really unclear what these automated systems can actually do. I've been reading up on it, which led me to pre-mapping.
So, localization; we're talking about localization right now, and this is my pre-mapping slide, which hints at the fact that you need specialized code for pre-mapping, and that seems to be a rather difficult issue. More importantly, though, there is localizing, which means complementing the GPS signal with other technologies so that you really know where you are, and not only to within a range of about 10 meters. This is a lidar-specific problem, because it's about keeping the maps current. The issue is that the map changes really frequently, as you can see up there, and if it changes too much you can actually lose your localization. And that is needed for the car to know where it is, and so on. So in the end this advanced technology presents a drawback for self-driving cars.
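To make the point about losing localization concrete, here is a toy sketch of map-based localization: the car compares its current scan against a stored map at candidate positions and picks the best match, and if the world has changed too much since the map was recorded, no candidate matches well and the position estimate has to be treated as lost. The grid, the matching score and the threshold are all invented for illustration; real systems match lidar point clouds against high-definition maps.

```python
# Toy map-matching localizer. The "map" and "scan" are tiny binary occupancy
# rows; the failure mode illustrated is the one from the talk: if the world
# has drifted from the stored map, matching confidence collapses.

MAP = [0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0]   # recorded at mapping time

def localize(scan, world_map, min_score=0.8):
    """Slide the scan along the map and return (best_offset, best_score)."""
    best_offset, best_score = None, 0.0
    for offset in range(len(world_map) - len(scan) + 1):
        window = world_map[offset:offset + len(scan)]
        score = sum(a == b for a, b in zip(scan, window)) / len(scan)
        if score > best_score:
            best_offset, best_score = offset, score
    if best_score < min_score:
        return None, best_score            # localization lost
    return best_offset, best_score

fresh_scan = [0, 1, 0, 1, 1, 1]            # world still matches the map
stale_scan = [0, 0, 1, 0, 0, 0]            # construction changed the street

print(localize(fresh_scan, MAP))           # -> (4, 1.0): confidently localized
print(localize(stale_scan, MAP))           # -> (None, ~0.67): localization lost
```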
Moving on: the weather issue we can just skip, because we knew that one already. Interestingly, there are developments that can see in the dark, but how fast they can drive and whether you would want to sit in them in all situations is a completely different question.

And now we come, as opposed to the problems with recognizing objects and classifying them properly, which is perception, to prediction. And again, even though impressive, it is an open problem. "If you can predict the future accurately, then planning how to react to those situations is easy to solve." That sounds like an A-equals-A sentence, but "being able to predict the future actions of recognized objects in autonomous car computing is an open problem", and that is Dr. Eustice from Toyota, the SVP for automated driving.

The issue here is that the car has very specific problems to solve, and that is the semantic recognition of something, not only the understanding of the surroundings, not only the perception of the surroundings. If you perceive the surroundings, you can see people at the side of the road. But if you understand the surroundings, you understand the difference between teenagers, who might be erratic and run onto the street, to use the obligatory example, and an older lady with a young child who very conscientiously wait for the lights to turn. And that is a difference that humans are much better at recognizing than cars are.

And then we come to the ethical problems. So we have perception; once you have perceived correctly, if you want to distinguish it like that, you have to predict correctly what those objects are going to do, which is a whole other question. And once you've done that, you still have some ethical problems, which are usually explained by the trolley problem. Now, for a couple of reasons which will become clear with the next slide, I've put this in only as a joke, because the trolley problem, as it turns out, doesn't give us too much information for the development of autonomous cars, neither on the programming side nor on the ethical side.
Although it does focus our attention, and thus it is not to be missed as a point or a topic: it focuses our attention on questions of responsibility and autonomy, or on utilitarian questions, for example as in Mill, or on determinism, things which need to be thought through if we want to be able to structure society properly. We can't just leave ethics out; that seems obvious, but I'm going to make the point once more. And here you can see driver versus pedestrian and cyclist, which is another version of this. It just roughly says what I've told you: there has been a lot of hype around the trolley problem, but in the end the information that we get out of it is rather restricted compared to the situations that could actually happen to you as a driver.

Which brings us to fatal crashes due to perception failure. I'm going to go through this quickly, because we've made the point a couple of times now; basically it comes back to the perception issue. And Alastair, I think there is one more slide about security, but I'm happy to hand over right now.

Yes, and this is really interesting. We talked about the fatal crash with Uber in 2018, and what was interesting was what came out when it was investigated by the American authorities. They found what they called a cascade of design failures all the way through the process. The car itself had six seconds to determine what was in front of it, what kind of object it was, and it kept alternating between different classifications. Every time it switched between thinking it was a bike, an unknown object or a person, it lost the memory of the movement of that person, so it couldn't actually adjust its path according to the situation it found itself in. And then, when it got close enough, an action suppression system kicked in to prevent sudden movements, which prevented it from handing over to the driver in time.
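A stripped-down sketch of the failure mode just described: a tracker that throws away an object's motion history whenever its classification changes can never accumulate enough observations to estimate where the object is heading. This is a deliberately simplified illustration of the reported behaviour, not a reconstruction of Uber's actual software.

```python
# Toy object tracker that (like the behaviour described in the crash report)
# resets an object's history whenever its classification flips, and therefore
# cannot estimate the object's velocity or predicted path.

class NaiveTrack:
    def __init__(self):
        self.label = None
        self.history = []          # list of (t, position) observations

    def update(self, t, position, label):
        if label != self.label:    # classification changed -> history discarded
            self.label = label
            self.history = []
        self.history.append((t, position))

    def velocity(self):
        if len(self.history) < 2:
            return None            # not enough history to predict anything
        (t0, p0), (t1, p1) = self.history[0], self.history[-1]
        return (p1 - p0) / (t1 - t0)

track = NaiveTrack()
# A pedestrian crossing steadily, but classified differently every frame:
for t, pos, label in [(0.0, 0.0, "vehicle"), (0.5, 0.8, "other"),
                      (1.0, 1.6, "bicycle"), (1.5, 2.4, "pedestrian")]:
    track.update(t, pos, label)
    print(f"t={t:.1f}s label={track.label:<10} estimated velocity: {track.velocity()}")
```

With a stable label the same loop would produce a velocity estimate of about 1.6 m/s and a usable predicted path; with the flip-flopping labels the track never gets past a single observation, which is essentially what the investigators described.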
And what's interesting about this is the level of failure the authorities found there with Uber, and also the failures in the safety regime which oversaw the safety drivers: they didn't drug test, there was no oversight. And yet, when it came down to it, they ended up charging the driver and not actually sanctioning Uber in any way. And coming back to the point Auris was making about the safety issues: who is actually going to be responsible for a fatal crash? However safe these cars are, they are always going to cause some fatalities; that is inevitable at the scale of automotive transport. But who is actually going to be accountable for it? This is not a good precedent.

So, we talked about cybersecurity. Should I just quickly... OK, I can do that one very quickly, because this is a very specific one. If you see the headline (can you put up the headline? yes, thank you), it says "automotive industry cybersecurity practices", measured, or assessed, in an independent study commissioned by SAE International. Synopsys, which co-commissioned it, sells software for autonomous driving, so we can see where that is coming from; but those guys are trying to get it into boxes that we can work with.
So, we talked about cybersecurity — I'll do this one very quickly, because it's quite specific. The headline — can you put the headline up? Thank you — is about automotive industry cybersecurity practices, measured, or rather assessed, in an independent study commissioned by Synopsys together with SAE International. Synopsys, it should be said, sells software for autonomous driving, so we can see where that is coming from, but they are trying to get the issues into boxes we can work with. These are the key results from the study, and the study is not only about connected or autonomous cars but about cybersecurity in the automotive industry in general. This slide is just a longer explanation of the previous one. The three key findings are: software security is not keeping pace with technology in the auto industry; software in the automotive supply chain presents a major risk — an issue that will lead us back to proprietary versus open software, among other things, because the software comes from third-party suppliers and sometimes the OEMs have to superimpose things on top of it to make it more secure (we can go into this in the discussion); and connected vehicles have unique security issues — we could all have guessed that one. I wanted to throw this out there because they did some interesting questionnaires with people from the industry and from research as well.

OK. And when we're talking about these cybersecurity issues, we also need to talk about data and privacy. Every Tesla on the road today is equipped with the hardware for autonomous driving. That means it has eight high-definition cameras giving it 360-degree vision, all recording; it has twelve ultrasonic sensors; it has GPS; it has an inertial sensor; it even monitors the pedals and the steering. All of that sensor information is being shared with Tesla, and it can be recording even when the car is stationary — in effect it is recording all the time.
So it's not just recording the people in the car, it's recording all of its surroundings and all of the people there as well. Research suggests that a fully autonomous vehicle could be sharing something like 40 terabytes of data every eight hours.

And then we have the cybersecurity issue. What if we have malware in a car? It's one thing to have it in a computer at home; when that computer sits in two tons of metal going at 120 kilometres an hour, that's a bit of a problem. And it's not just malware in the car itself. A lot of researchers are concerned about passive attacks — contaminating the environment with misleading information, for example manipulated road signs, which can really distort what a self-driving car's perception system sees. All of this could have fatal consequences. So there are massive issues here.

I would like to add that we are aware that recording and monitoring are not the same thing: monitoring something and recording it don't have to go together. They usually do, though, for a couple of reasons — mostly because if you didn't need the data afterwards, why would you monitor in the first place? Those are things to think about.

Sure. So, let's see. Having seen all of these complexities, difficulties and challenges, we may want to revisit why exactly we need self-driving cars anyway. One of the obvious answers we're given is the frequently repeated claim that it's going to be an immense boost to our economy. One recent report said it will add seven trillion dollars to the world economy.
That's twice the size of the economy of Germany. And how is it going to do that? Well, when you look into the report, you can see where they're going: it says three point seven trillion dollars will be spent on mobility as a service — in other words, on taxis. That's an awful lot of money; it's more than the entire automotive industry generates today, which is about three trillion dollars. Is spending that much on taxis going to make us richer as an economy, as a society? It's hard to see, really. Likewise, they look at freight transport: three trillion dollars spent on autonomous vehicles. That could be more efficient, but of course we have a huge workforce employed in the freight transport industry — something like five million professional drivers in Europe alone. What happens to them? Is this really a great idea for our economy? And fundamentally, is it actually going to make us that much richer if, when we're in a car, we're looking at our emails instead of driving? It's hard to see.

Safety is the other big argument that's often made. This is taken from Waymo's website: one point three five million road deaths every year, and — a statistic you often hear around self-driving cars — 94 percent of accidents are caused by human error, the implication being that autonomous vehicles will somehow address all of those.
But a lot of researchers question that and say that actually only about a third of those accidents would be avoided by autonomous vehicles, even where humans are involved. There's nothing an autonomous vehicle can do, for example, about a pedestrian stepping into the street right in front of it to avoid the crash. So the idea of safety is a big, big question. It's an assumption, and there's no real data to support it. What we know is that autonomous vehicles can be reasonably safe in controlled environments, but that's not the same as a normal city.

And then we're given this kind of vision. This is Berlin — a lovely Berlin a few years in the future, as created by Daimler — and this is from a report by Synopsys, another company in the self-driving car industry, about how it's going to reduce congestion (a very popular argument), how it's going to cut transportation costs by 40 percent — hard to see how, given the cost going into the research — and how it's going to improve walkability and livability.

This congestion issue keeps coming up. The idea, presumably, is that if you have a whole fleet of autonomous vehicles, they can just drive bumper to bumper at 70 kilometres an hour and be hugely efficient. But it doesn't really work like that. For a start, autonomous vehicles will almost certainly, for the next few decades — even if they exist at all — be working in mixed traffic with normal cars. How is that going to be more efficient? And the evidence suggests that autonomous vehicles could actually increase traffic congestion, as people start using them for completely frivolous journeys, even ones they're not in themselves. So the issue around congestion is very questionable indeed.
Indeed, traffic planners also point to public transport. Highways today can carry a maximum of about 2,000 cars per hour. If you're very optimistic about autonomous vehicles, you could possibly quadruple that, though that's really stretching it. But a good public transport system will move 50,000 people per hour, and, as this urban planner says, no technology can overcome that basic geometry.

Maybe just as a side note: if automation could eliminate all the driver-related factors involved in crashes, that would help a lot. It's a big "if", though, and there are numbers out there showing that even with increasing levels of automation you don't get that much out of it — I'd have to check the exact figures. The point is that those assistance functions work well on higher-speed roads that are already so well structured that things work anyway, and that's not usually where you get the crashes; and even if you exclude those cases, it would still only mean something like 17 percent fewer deaths and nine percent fewer injuries. We have to look at the numbers properly, but that's an interesting thing to note.

One of the other challenges here, as we said, is that it's so difficult for cars to perceive their environment. Inevitably, given the scale of the industry, you're now seeing the alternative being proposed. This is from Andrew Ng, one of the most prominent AI researchers in the world today: he wrote an article saying self-driving cars won't work until we change our roads and our attitudes.
In other words, it's up to us to adapt to them — and that argument is going to get increasingly loud in the years to come. As one transport expert put it, the open spaces that cities like to encourage would end as the barricades went up, and keeping people off the road would have to be enforced with an almost authoritarian rigour.

Maybe that's also why we're seeing a considerable amount of hostile action — human beings being really quite cruel to robots and self-driving cars. This example is from Arizona, and there have been other cases as well.

But there are other issues we need to think about. If you want to look at these scenarios, the video coming up shows one of the situations where you really need to worry about where a self-driving car is going.

[A video clip plays; the audio is largely unintelligible.]

As you might have expected, that footage is from the recent fires in California — something you would think might be present in the minds of a lot of people in the self-driving car industry, since they're all based around Silicon Valley and have probably encountered problems with fires over the last few years. The real issue, then, when we're talking about driving is not who drives a car but the fact that we have any cars at all. With 1.4 billion cars currently on the planet, anything we do with cars is going to be unsustainable, no matter how we change the technology that drives them. And of course what the self-driving car assumes is that somehow or other it's going to be fine, because they're all going to be electric.
Well, this is a lithium plant in Bolivia. Admittedly it looks quite pretty from up here, but you have to remember that each of these evaporation pools contains toxic waste, and lithium extraction, like any other extractive industry, is appallingly destructive to the environment and to the places where it happens. It has a huge cost.

And when you look at lithium, the quantities we would require are vast. If there are 1.4 billion cars in the world and we converted them all, at 12 kilograms of lithium per car — the normal amount at the moment for, say, a Tesla — that's sixteen point eight million tonnes of lithium. Yet we have about 80 million tonnes of so-called resources, the known quantities, but only around 17 million tonnes of reserves, the part we can actually extract. In other words, essentially all the lithium we know we can extract would have to go into these cars. There would be nothing left for mobile phones — they'd have to go clockwork — and it's ten times the lithium production we actually manage today.

And of course lithium is not the only element we need to look at. There is cobalt as well — a kilo of cobalt in a lithium battery — and that comes primarily from the Democratic Republic of the Congo, which is at the centre of some of the world's worst child slavery situations. So again you have this real problem of locking ourselves further and further into an extractive industry that is fundamentally unsustainable.
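As a quick sanity check of those round numbers — the inputs below are the talk's own figures, not authoritative geological data — a minimal back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the lithium figures quoted in the talk.
# All inputs are the talk's own round numbers, not authoritative data.

cars_worldwide = 1.4e9        # cars currently on the planet
lithium_per_car_kg = 12       # roughly a Tesla-sized battery pack
reserves_tonnes = 17e6        # lithium we know how to extract today
resources_tonnes = 80e6       # known deposits, extractable or not

needed_tonnes = cars_worldwide * lithium_per_car_kg / 1000
print(f"Lithium needed to electrify every car: {needed_tonnes / 1e6:.1f} million tonnes")
print(f"Share of known reserves:  {needed_tonnes / reserves_tonnes:.0%}")
print(f"Share of known resources: {needed_tonnes / resources_tonnes:.0%}")
# -> about 16.8 million tonnes, i.e. roughly 99% of reserves and ~21% of resources
```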
And even when you look at the carbon, it's not so clear that a lithium-powered electric vehicle is going to be more sustainable than a car with a normal combustion engine. Researchers in Germany have found, for example, that over the whole life cycle of a car — from manufacture, through the energy it consumes while in service, to its eventual disposal — the carbon impact of an electric vehicle is probably just as high, at least in Germany, because of the dependence here on fossil fuels and coal power. Other research has been more optimistic: looking at the whole life cycle, it found that a standard petrol car produces something like two hundred and fifty grams of CO2 per kilometre, whereas a Nissan Leaf — one of the lightest electric cars available today — comes to about one hundred and forty-two grams per kilometre. A lot less, admittedly, but still significant, and that is one of the lightest cars.

Autonomous vehicles are also having another kind of impact, and that is on public policy today. In El Camino in California, they tried to introduce bus lanes, and the plan was overrun by people saying bus lanes would be antiquated and that we need to wait for self-driving cars. That was in 2014. They're still waiting for the cars, and they still don't have any bus lanes. The same thing happened in Detroit, where a referendum was overruled. And here you have what is really at the heart of it, from one of the leading venture capital firms: don't build a light railway system.
"Please, please, please don't," says this person from Andreessen Horowitz. "We don't understand the economics of self-driving cars because we haven't experienced them yet. Let's see how it plays out."

And here — you can even see the picture, this is from the last few weeks — is Google talking about how it is helping with climate change, how it is using AI to address carbon impact. Nothing here about the fact that it is ploughing tens of billions of dollars into a technology that is taking us towards climate change. So the reality is that we have one option for safe cities, and that is to take cars out of them altogether. And what we need to consider is why we are going down the self-driving car route in the first place at all. It's like the other technologies the tech companies tend to push on us — going to Mars, cryogenics — things that belong in a teenager's bedroom. With 100 billion dollars we could do a hell of a lot more: we could build cycle superhighways, we could fund ten years of free public transport. So this is the real future of the car, autonomous or not: this is the way cars can contribute to a sustainable future. Thank you.

All right — do you have time for a few questions? Because there are a couple of them. The first one is: is it more a liability issue or a technical issue that there are no autonomous vehicles on the street yet?

Well, first of all — technologically, if I may? Yeah.
I mean, it's just not yet possible, because it's always restricted to geofenced areas where the maps have been pre-built, or where the weather can't harm you. So truly autonomous — or let's call them self-driving — cars are not out there, for technological reasons. But we can be quite happy about that, because the liability question, even if they're trying to grapple with it now, is not at all resolved, and it doesn't look like it will be worked out in a way that lets us humans simply lean back, regardless of where the millions are flowing right now. And the liability problems are so intractable that it's hard to see where the solutions really lie under our current regulatory systems. So: both. Thank you.

The next question is: are regulators or insurers worried about the danger inherent in human passengers who aren't paying attention? Only needing to take control in extreme conditions seems far more dangerous than driving that requires constant attention.

Nice one — that's one for you. Yeah, sure. This is absolutely correct. It's a real problem with level three. We went through the different levels earlier: level-three autonomy seems technically the most achievable, and it's what Tesla is aiming for. The real problem there is that it's very hard to get drivers to pay attention if they're not actually driving, and research has shown time and time again that, as a result, reaction times when something does go wrong are that much slower. This is a massive, massive problem, and I think it was highlighted in the scale that you showed.
But this is a big reason why, for example, Waymo and other self-driving projects are going straight to level five. They actually find level five an easier technical challenge than trying to address this human-interaction problem — before, they simply couldn't erase the driver. And this question has another beautiful connotation: it goes in the direction of what they are actually interested in — are they interested in saving the car or the human, roughly speaking, if I heard that correctly? That's a nice one, because that's exactly what is so interesting about the turnaround they have made. First they try to skip level three, and then they try to make the drivers see it through so that they can reach level three, and then four and five. And it's so odd because, as I said — or at least I hope I said — it's never the everyday situations that are the problem with autonomous driving. It's always the interesting, out-of-the-ordinary situations, the ones the system is supposed to learn from the driver, the situations where the system would never have decided the way the driver did. That's a hundred-and-eighty-degree turn, and then another hundred and eighty degrees back into the direction you had been going in before: first you want to eliminate the human as the source of error, and then you need the human precisely to prevent the worst errors. It's a complete turnaround.

OK. There are quite a lot of questions here, and I'm afraid we won't have the time to ask them all — or I'll take one. But first, there's already the question: where can we discuss this further?
Well, for tonight I would recommend meeting in the pad. And Coco — this is why I'm so glad you mentioned it beforehand — is basically a standing possibility to hold colloquia and conferences on all the kinds of topics you're interested in. I'm going to give a different version of this talk again at Coco, and I would be delighted if you came along and brought in your expertise; I'm sure my co-speaker is going to be there as well. Apart from that, email us — or meet us and follow up in the chat afterwards. Yeah.

OK then, thanks. Thank you from me as well. The chat here says this is the best talk they've heard, and there are a lot of questions, so I'll give you the link afterwards and you can discuss them with the speakers over there. So thank you for the talk.

Thank you. Thank you so much.