OK, welcome, everybody, to our next talk, which is called Ethics in a Data Society, and to our next speaker, Maya Ganesh. She is Director of Applied Research at the Tactical Tech Collective. You might know her; she calls herself a research practitioner, writer and activist. She has a master's degree in media and culture studies, but also in applied psychology, which is an interesting mix, I think. The issues she's working on are technology and activism and human rights defence, with a focus on gender and technology. But today, because Maya is also a doctoral candidate at Leuphana University in Lüneburg, which is not far away from here, she will present her doctoral project on ethics in the context of driverless cars, which is the topic of our talk today. In her lecture, Maya will discuss her ongoing research on the emergence of this new technology and how it shapes our understanding of the meaning of ethics in a data society. Give a warm welcome to Maya and enjoy the talk.

Thank you. Thanks a lot, and thank you very much to 33C3 for giving me this opportunity to talk about my work. I've actually talked about this topic a couple of times this past year. However, this is a brand new talk, because everything is recorded these days, so you can't recycle at all. Still, it's weird to feel slightly nervous, even though it's something that you know very well. And, you know, like they say, nobody knows your Ph.D. topic like you do.
Maybe that's because of the number of really wonderful people who I know and love and respect in the audience; maybe a little bit of the nervousness is coming from there. It's always hard to talk to people who you think are smarter than you. Anyway, so that's why this is a nice way to title my talk. I also changed the title to "Entanglements"; it's not "Ethics in a Data Society". I just did that so that 33C3 would accept the talk. This was always supposed to be the title.

So when I started thinking about ethics, I actually started in a very different place. I wasn't really that interested in driverless cars. I was interested in this thing that I discovered called an ethical app, which is actually a shopping app about making the right kinds of consumer choices. And it struck me as very odd that people thought that ethics could come out of an app. And I'm actually not really into ethics; if I can get through the next four years without reading Aristotle, I'd be really happy. But I was curious about what it is about technology that makes people feel like it will give us answers to these really complex social and political problems that we face. And that's where I come from in trying to understand the workings of society and technology.

So when we are talking about ethics, what are we really talking about? That's the first thing I had to engage with: that we're actually having a discussion about a technology that's emerging in society. And driverless cars are at the peak of what Gartner calls the hype cycle, so every day there's a story about driverless cars.
A lot of this technology, which I believe is not very robust and not very well tested, is actually being developed in the open, so it's very difficult for the public to develop a sense of confidence in what this technology means. And things like cars are so banal and so mundane, but they are very intimate machines, and we have a long history of relating to them. So when we talk about ethics, what are we really talking about? I'll go through some of these things in the course of my talk today, and I will also skip over some things in the interest of time, but I think there's space for questions as well at the end.

So at one level, we're talking about things that can be programmed into software: moral decisions that will result in certain outcomes that we term ethics. When we say ethics and driverless cars, or ethics and autonomous technologies more broadly, we're also thinking about who's responsible when something goes wrong, but also who is going to pay when something goes wrong. And I think the financialization of risk and the rise of the insurance industry is something I'm personally quite interested in, and I'll talk a little bit about that.

But the most interesting thing that I think I discovered in the course of doing this work is my own interest in science fiction and how we relate to machines. I think that a lot of our understanding around new technologies today has to do with the fact that we grew up watching things like Star Wars, or we watched Ex Machina; cinema and literature and fantasy are such a big part of our relationship with the technologies that we create and the technologies that we inhabit. So I should also say that I'm not going to talk about that so much in this talk, but maybe in some other.
So we're seeing a combination of all of these multiple conversations going on at the same time, and I'll start by talking about ethics as something that we think can be programmed. There are some really interesting new projects that think that ethics can be an outcome of a software program.

I usually start by talking about this thing called the trolley problem, which has become very popular recently thanks to Google's self-driving car project. Google recently renamed its self-driving car project Waymo. And they popularized this thing called the trolley problem, which comes from the 1960s and a philosopher called Philippa Foot, who actually came up with the trolley problem in order to think through the permissibility of abortion.

So I usually give a big description of what the trolley problem is, and it is kind of jargon heavy. But a good thing is that a team out of MIT developed something called the Moral Machine project, which is based on this classical thought experiment, the trolley problem. So I'll talk about the Moral Machine project, about ethics bots, and about the application of another quite old, 400-year-old problem called Pascal's Wager. I will try to make this funny and not too complicated; I mean, even I don't get it sometimes.

So, the Moral Machine. You can look at it at moralmachine.mit.edu. It's based on the trolley problem, like I said. And the idea in the trolley problem is that a trolley, a tram, is going down a track, the brakes fail, and the trolley is not going to stop. But there is a fork in the track and there is a lever.
So you can either make the trolley go left or you can make the trolley go right. Now, the thing is, on the left there are five people working on the track, and on the right there's one person. So the decision you have to make is: is it better to kill five people or better to kill one person? This is absolutely impossible to resolve, as you know, but this is what the trolley problem does.

So you have to decide who you're going to kill, but you also have to come up with the rationale for how you're going to kill them. This is the classic tension between utilitarian ethics and Kantian deontological ethics, which simply asks: does the outcome matter, or does the rationale for why you would kill either the one or the five matter? And again, this is impossible; it's really difficult to do. But the Moral Machine project does this. So they set up a scenario, and these are just screen grabs from the website, which you can look at. A car is coming down the road, the brakes fail, and there are people crossing. In this scenario there's a thief, there's an old lady, there's a baby in a pram, there's a cat. And you have to decide who is going to be saved. There are little skull-and-crossbones signs showing who will be killed and who will not be killed in this scenario. There are others as well. In this one, it's five dogs versus five cats. And this is another scenario. So this is part of a research project that the scholars at MIT are doing: you browse through thirteen scenarios and you decide who you're going to sacrifice in this hypothetical test.
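To make the tension described above concrete, here is a minimal sketch of how a purely outcome-counting (utilitarian) rule and a simple deontological rule diverge on the same scenario. The scenario encoding, the option names and the two rules are invented for illustration; they are not taken from the Moral Machine or from the talk.

```python
# Toy illustration of the trolley-problem tension described above.
# A utilitarian rule counts outcomes; a simple deontological rule
# refuses to actively divert the trolley onto anyone.
# The scenario encoding is invented for illustration only.

scenario = {
    "stay_on_track": {"killed": 5},   # do nothing: trolley hits five workers
    "pull_lever":    {"killed": 1},   # divert: trolley hits one worker
}

def utilitarian_choice(options):
    """Pick the option that minimizes the number of people killed."""
    return min(options, key=lambda name: options[name]["killed"])

def deontological_choice(options):
    """Refuse to take an action that directly kills someone:
    always 'stay_on_track', whatever the body count."""
    return "stay_on_track"

print(utilitarian_choice(scenario))    # -> pull_lever  (1 death < 5 deaths)
print(deontological_choice(scenario))  # -> stay_on_track
```

Nothing in the sketch resolves which rule is right; the two functions disagree by construction, which is exactly the difficulty being described.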
And then you can go on to fill in details, if you wish, about who you are and your opinions on why you made certain choices. This is all in the interest of research; you can also choose not to give your data to MIT. And I did the test. I've actually done it a number of times, and I have to say that a couple of times I just randomly hit any option. So the test taker wanted to know why I preferred, say, younger people to older people, or thieves to athletes. And this is how you develop this model scenario where you have to decide who's more important. And the idea is that the driverless car software has to make this choice, which even the best of us cannot make.

I found this quote two days ago from one of the French philosophers behind this project. And it's interesting that he would say this: that at this point we are looking into various forms of shared control, which means that any accident is going to have a complicated story about the exact sequence of decisions. So it's not easy to program these things. Even in this age of machine learning, I don't think it's entirely easy to program these things, and I think the developers of the test think so too.

So the trolley problem pits consequentialist against deontological ethics, and it is really, really difficult to resolve why and how you should kill X or Y people. So other scholars have thought about other approaches, where you take the different choices you can make, these terrible choices you have to make, you make a list of them, and you assign numerical values to the outcomes.
And then a computer program will parse those outcomes and decide, quite randomly, which one to choose. And this one I've labelled "Does God exist?", because the Pascalian wager that this is based on actually comes from an idea in the sixteen-hundreds about the existence of God. And it's actually kind of interesting, because it's about how you would feel if you discovered that God did or did not exist; there are feelings you would have about that. So Pascal's wager is a way to rank these different outcomes numerically. And Vikram Bhargava thinks that you could apply the Pascalian approach to these terrible outcomes in the case of a car accident.

And then we have ethics bots, which is a newer idea from the Etzionis. They looked at users of Nest, the thermostat system that regulates heating in the home, and they said you could develop AI-assisted ethics: you could have a meta bot to rule the algorithms. This meta bot, this ethics bot, would identify patterns in an individual user's Nest use. The assumption is that if you're somebody who cares about the environment, you are more ethical, and you will not use so much heat in your home; you will be very conscious about the regulation of temperature in your home and you will try to use less energy. This is taken as an assumption of what ethics is, which actually made me laugh out loud when I saw it, not only because of a bot having to now oversee algorithms, which is kind of crazy, but because they say they want this to be personalized; they want individuals to have control, and individual moral decision making to come into this framework.
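As a rough illustration of what "identifying patterns in an individual user's Nest use" could mean mechanically, here is a minimal sketch. The threshold, the label and the data are assumptions invented for illustration; this is not the Etzionis' actual proposal or Nest's API.

```python
# Toy "ethics bot": infer a user's supposed environmental ethics
# from their thermostat history. Thresholds and labels are invented
# for illustration; this is not the Etzionis' actual system.

from statistics import mean

def infer_ethics_profile(daily_setpoints_celsius, eco_threshold=19.0):
    """Label a user 'eco-conscious' if their average heating setpoint
    is below an arbitrary threshold, otherwise 'not eco-conscious'."""
    avg = mean(daily_setpoints_celsius)
    label = "eco-conscious" if avg < eco_threshold else "not eco-conscious"
    return {"average_setpoint": round(avg, 1), "label": label}

# A week of thermostat settings for one user
print(infer_ethics_profile([18.0, 18.5, 17.5, 19.0, 18.0, 18.5, 18.0]))
# -> {'average_setpoint': 18.2, 'label': 'eco-conscious'}
```

The fragility is visible in the sketch itself: the label is inferred from a single signal, and that signal can have many causes, which is exactly the objection that follows.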
But what if you're actually somebody who regulates the heat in your home but takes a hundred flights a year? Or maybe you're a menopausal woman who just wants to regulate the heat in the house because you can't deal with the hot flashes. It may have nothing to do with how you actually feel about the environment. But the Etzionis think that ethics bots could be a way to evolve moral decision making, or to find a way for machines to learn about moral decision making. Now, the reason all of these are interesting, and of a set, is that we believe that ethics is an outcome; we believe that ethics can be something that software can be programmed to do, can learn.

And I always try to introduce Donna Haraway into any of my talks, because I think she's really amazing, and some of my work tries to look at what it means when we think that these wonderful machines that we've created will have the answers for us. James Bridle today talked a little bit about things like machine vision and computer vision: what happens when we start creating systems and environments for machines to inhabit, for machines to understand and for machines to parse? So I ask myself: it's not so much about ethics alone. Ethics is important, for sure, but what ethics is, is sort of complicated at this point. But what does the code, and what does the city, become when it's made for a driverless car?

And I came across this video, which is by a design company called BIG. The name of the video is "Driverless is More".

[Video plays.]
So, of course, it's sort of interesting that there are no humans in any of these scenarios of the future with driverless cars. And there's this quote that I also really like, from a paper called Crapularity Hermeneutics by Florian Cramer, where he thinks about what happens if we actually start building systems for machines to inhabit. If you can read the paper, please do; it's great. But this particular quote is about trying to imagine what the city looks like when it's made for cars. And this is not actually such a distant possibility, because if you talk to people who are involved in regulation and industry and government, you see that driverless cars, no matter what Gizmodo tells you, are not going to just show up on the street. There's actually a long, painful process of technology being integrated into society, and you have to have a lot of law and regulation as part of that. And it's really interesting, in that light, to look at the history of electricity and telephones and how completely they disrupted society. I think driverless cars are going to do similar kinds of things, maybe not so visibly at first, but it's going to reshape a lot of our social relations. I mean, you have to think about old people, or seven-year-old children, who would now have independent lives and mobility as well.

And I think the update on Turing's old question is not "what do we think of machines that think?" but "what do we think about machines that can learn?" And what we know about machines that learn right now is that they don't learn very well. They learn like toddlers.
And we've seen this: machines running machine learning algorithms will replay a lot of the biases that are embedded in the original data sets that they're learning from and that they're based on. And Donna Haraway does that again: she refers to AI and primates and children as "almost minds", to capture this way in which they learn. So I think we're still very much in the early stages, but there's a lot riding on the idea that someday the software is going to do amazing things.

But before we get there, the other register in which we talk about ethics is about who's going to pay for it and who's accountable for it. And I think the history of crash testing in the car industry, and the concurrent rise of the insurance industry, is really pertinent here, because I think some of these shifts are happening right now, and I'm personally hoping to look at more of this in the coming year or so. In the 1920s and 30s, a lot of car crashes happened, but people didn't always know why. And so car manufacturers would spend a lot of effort building cars just to crash them. The insurance industry grew along with this and learned how to understand how crashes happen. This also allowed science and technology to develop, because people understood a lot more about physics and the materials they were using. And this was, of course, very expensive. But in the 1990s, you saw car manufacturers using modeling and simulations to feed in a lot of different variables about crash scenarios and to understand how crashes were actually going to happen.
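As a rough sketch of what "feeding in variables about crash scenarios" can look like once testing moves into software, here is a minimal Monte Carlo style loop. The crash model, the parameter ranges and the injury threshold are all invented for illustration and are not any manufacturer's actual method.

```python
# Toy "road to lab to math" sketch: instead of crashing physical cars,
# sweep simulated crash parameters and count how often a (very crude)
# injury criterion is exceeded. The model and thresholds are invented
# for illustration only.

import random

def crash_severity(speed_kmh, overlap, stiffness):
    """Crude stand-in for a crash model: higher speed, more overlap
    and a stiffer structure all push the severity score up."""
    return speed_kmh * overlap * stiffness / 10.0

def simulate(n_runs=10_000, injury_threshold=120.0, seed=42):
    rng = random.Random(seed)
    exceeded = 0
    for _ in range(n_runs):
        speed = rng.uniform(30, 90)        # impact speed in km/h
        overlap = rng.uniform(0.2, 1.0)    # fraction of frontal overlap
        stiffness = rng.uniform(15, 30)    # arbitrary structural factor
        if crash_severity(speed, overlap, stiffness) > injury_threshold:
            exceeded += 1
    return exceeded / n_runs

print(f"{simulate():.1%} of simulated crashes exceed the injury threshold")
```

The point is the workflow rather than the physics: once the scenario space is parameterized, tens of thousands of virtual crashes cost nothing to run.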
So scholars have actually written about this idea of "road to lab to math". And one of them, Nigel Gale, says that road to lab to math is basically the idea that you want to be as advanced on the evolutionary scale of engineering as possible: math is the next logical step in the process over testing on the road and in the lab, and math is much more cost-effective because you don't have to build preproduction vehicles and then waste them. "We've got to get in front of the technology so it doesn't leave us behind. We have to live and breathe math. When we do that, we can pass the savings on to the consumer." So this is what's already in the works and already being developed.

And so I'm quickly moving on to the next part of how we think about ethics, or what we are really talking about when we're talking about ethics. There is a lot of interest now in how technology is made. People are looking at the man behind the curtain, and it usually is a man behind the curtain who's actually making the magic happen somewhere. Where is this stuff being designed, and what are the politics of technology design? On the one hand, of course, you have things like Dieselgate, and there have been some talks here about Dieselgate as well, which are really interesting: Volkswagen's very blatant cheating on their emissions, using the defeat device, telling us that corruption happens in business. So you've got that scenario, but you've also got a lot of other scenarios just around the development of technology itself, and there are lots of good examples of how technologies that we experience today, especially on social media and consumer devices, betray the biases of their designers.
But I was really interested to learn just last week that the Institute of Electrical and Electronics Engineers, the IEEE, came up with a really good document around ethically aligned design. It's a 138-page document and I'm only on page 36, so if I'd read more I could tell you a lot more about it; it just came out on Monday. But what is interesting to note in it is that there's something much more serious going on, where there is, I think, a discussion about the fact that we cannot simply embed values in software: it is really hard to align human ethics with machine programming. And I think this is the first time I've actually seen a document that is so global in its scale and its scope, in the way that it's talking to many different kinds of people, including people that some of us in this room know, like Niels from Article 19. It's interesting that they're actually trying to look at human rights issues in what they're calling ethically aligned design. So I think there is going to be a lot more interest in looking at the context of production as well.

So we've got this idea that ethics is an outcome, that software will tell us the answer to what is the right thing to do. We've got the idea that maybe ethics is something we have to think about from the point of production: who's actually making these things and what are their intentions in producing them. Or maybe ethics is reduced to a question of the law and insurance. But when I try to think about it for myself, like really, honestly, what's going to happen, and what's this great future?
I'm not a futurologist at all. But if you do have to have some sense of how this is going to pan out, based on what you're already seeing, I think we're moving towards a place where we have to think about a different relationship with some of these machines that we're making and that we're going to have to be using soon. We already have levels of autonomy in cars, like cruise control and automated parking. And the history of cars and driving and crashes tells you that machines are very, very accurate and fantastic at performing certain computational tasks, but they are really bad at doing a lot of things that humans are very good at. In the testing of Google's self-driving cars, the Google cars were constantly being rear-ended, getting into accidents, because they were programmed to follow the rules. But as anybody who drives a car knows, or if you're a taxi driver in Berlin, perhaps, maybe not a regular German driver, because I think German drivers really follow the rules, but where I come from, if it's an open road and it's late at night, you're going to speed, you're going to bend the rules a little bit. And it seems like human beings do this, and human beings also bully robots; they bully cars. So it's actually kind of bizarre that companies like Volvo would be concerned about humans bullying their poor little autonomous vehicles.

And the history and the story behind the Tesla car crash, which happened in May, is about autopilot, and how this idea of autopilot, for anybody who goes up in a plane, is that, oh, the plane just flies itself under autopilot.
But it's quite different with driverless cars, because what happened was that the test driver was supposedly watching a Harry Potter DVD and had put the Tesla into autopilot mode, and the car could not distinguish between the sky, it was a bright day, a bright white sky, and the white side of one of those big, long trailer trucks. It just couldn't tell, which tells you something about the computer vision in Mobileye, which is in the Tesla. So the car went into the big trailer truck because it thought it was just the sky.

And so that brought this idea of autopilot out in a big way, and Tesla has gone to a lot of pains to say that autopilot does not mean your hands are off the wheel; the human has to be in the loop constantly. Even the German government recently told Tesla that when they bring their cars into this country, they have to change the word: they cannot use the word "autopilot" because it is too misleading. But I think this kind of thinking about autopilot, and about what kind of relationship, what sort of handshake, we have with machines, is something that's going to have to be explored more and more. It's not just about regulation and lanes and zoning for the car to actually be on the road, but maybe other kinds of licensing, and maybe other kinds of personal insurance as well, if you have to inhabit a space quite closely with machines like this.

So, maybe I have time to talk about the Ethical Fan, which is a bit of a sideways example. It's this kind of speculative project by two Shanghai-based French artists, and it's trying to poke fun at the idea that the machine will have the answer. But it's also about design.
Can you design ethics into machines? So they have this little fan, just a fan, and you can enter settings on the fan about people. Two people come and sit in front of the fan, and the fan has to make a decision about which way it will turn and who it will fan. So you can put in values about the people: so-and-so is a Christian or a Muslim or a Buddhist, so-and-so has this kind of education, this is how tall they are, how fat they are, how thin they are. And then these variables are sent to Mechanical Turk workers in Dubai or Cape Town or Washington, D.C., and these humans make a decision about which way the fan should turn, without being in that context at all or seeing anything about those people. I would like to think that we'll have this moment of loving grace with machines where we work together really smoothly, but I think before that happens there's actually going to be a lot of really rubbish things like this, where a Mechanical Turk worker halfway around the world will make a decision about what a machine should do.

And so I'm coming to the end of this, and I'm thinking about other sorts of approaches to ethics as well: that it's not something that software can tell you the answer to, and it's not something that insurance or car manufacturers will come up with, but that ethics gets produced contextually, in different situations, through a number of different actors and factors that are human and non-human. And McNerney has this great quote about it.
I love the fact that he calls it "not a test to be passed or a culture to be interrogated, but a complex social and cultural achievement", which is music to a science and technology studies scholar like myself. But it's really difficult when you have to think about how this happens practically. So I'm interested in seeing whether you can actually use design fiction as a methodology to get people to think through how this will happen. I recently had an opportunity to work with some programmers and designers at a company that I'll call BellinoTech. They said that I had to sign an NDA after the workshop, but the paperwork was a bit slow and they forgot to give me the NDA to sign, so I could tell you who they are. But I don't want to, because they're really nice people; I shouldn't tell you who they are.

So I got to work with this group of people and put them through a series of exercises. They work on building technology for autonomous vehicles, so this is very much in their world, and I didn't realize it, but these are questions they have to think about every day. What if you had to build a map app for parents and children to share information about children's independent journeys in autonomous, driverless cars? Or the second situation, which is actually kind of happening: a route-finding app wants to avoid high-crime neighborhoods, but the databases of high-crime neighborhoods are already so racially loaded, and come from such suspect sources, that the car is going to maneuver itself away from, well, you know, we "know" what high-crime neighborhoods are.
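To make that second workshop scenario concrete, here is a minimal sketch of a route scorer that penalizes "high crime" areas. The map, the scores and the penalty weight are invented for illustration; the point is only that the scorer mechanically reproduces whatever is baked into the score table.

```python
# Toy routing sketch for the workshop scenario above: a route scorer
# that penalizes "high crime" areas simply reproduces whatever bias
# is baked into that score table. All data here is invented.

crime_score = {"A": 0.1, "B": 0.9, "C": 0.2}   # per-neighborhood scores from a "suspect" database

routes = {
    "short_route": {"km": 4.0, "neighborhoods": ["B"]},
    "long_route":  {"km": 7.5, "neighborhoods": ["A", "C"]},
}

def route_cost(route, crime_penalty_km=10.0):
    """Distance plus a penalty that converts each crossed neighborhood's
    crime score into extra 'virtual kilometres'."""
    penalty = sum(crime_score[n] for n in route["neighborhoods"]) * crime_penalty_km
    return route["km"] + penalty

best = min(routes, key=lambda name: route_cost(routes[name]))
print(best)  # -> long_route: the scorer routes around neighborhood B,
             #    whatever the reason B got its score in the first place
```

Nothing in the code knows why neighborhood B has a score of 0.9; the router simply steers around it, which is how the bias in the source database becomes the behaviour of the car.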
So I'm still in the process of writing up the notes and thinking about what they talked about. But what I can say is that I was so struck by how thoughtful people were, and these are all programmers who work in big technology companies or are allied with car companies. People want to find ways to be political about these things. Engineers and technologists and scientists want to find ways to actually say: no, this is terrible, I see what the implications are of what I'm doing and I want to find a way to push back, but I am part of a much bigger corporate machine, and at the end of the day I'm paid money to build this technology. So I would like to do much more of these kinds of workshops, using design fiction as an idea and as a methodology and process. And I'm going to end there, because I'm out of time and I really need to drink some water. Thank you.

Thanks a lot, Maya. So take your time to drink something. We have time for, I think, one or two questions. OK, perfect. And we also have a Signal Angel in the room. Who is it? Please raise your hand. OK. Yes, please.

So my question is a bit general. But when it comes to responsibility in developing systems that are very critical with respect to ethics, for example crime-rating algorithms that are used on convicted people and things like that, we have messed up a lot there. And this is just one instance. But just assume some ethically relevant program or algorithm: anybody who is involved in developing that algorithm or developing the technology has some stake in the responsibility.
875 00:30:15,520 --> 00:30:17,739 What would be your answer to 876 00:30:17,740 --> 00:30:20,079 explain this kind of responsibility, 877 00:30:20,080 --> 00:30:22,929 and how to untangle 878 00:30:22,930 --> 00:30:25,239 all the responsibility that goes into 879 00:30:25,240 --> 00:30:26,240 this process? 880 00:30:27,100 --> 00:30:29,229 The short answer is: I don't 881 00:30:29,230 --> 00:30:30,069 know. 882 00:30:30,070 --> 00:30:31,659 And I don't think 883 00:30:31,660 --> 00:30:33,969 anybody knows. I mean, you know, 884 00:30:33,970 --> 00:30:35,979 anybody who kind of works on these kinds 885 00:30:35,980 --> 00:30:38,199 of projects, or any kind of project, knows 886 00:30:38,200 --> 00:30:40,449 that there are millions of lines of code 887 00:30:40,450 --> 00:30:42,459 in these things. There are millions of, you 888 00:30:42,460 --> 00:30:44,679 know, discussions about how 889 00:30:44,680 --> 00:30:45,759 a project should evolve. 890 00:30:45,760 --> 00:30:47,109 I mean, I've been reading a lot 891 00:30:47,110 --> 00:30:48,729 about the history of terrible 892 00:30:48,730 --> 00:30:49,929 incidents and crashes, like the 893 00:30:49,930 --> 00:30:52,899 Challenger space shuttle crash in 1986. 894 00:30:52,900 --> 00:30:54,909 And that was something that happened at 895 00:30:54,910 --> 00:30:56,230 one of the best-resourced agencies, 896 00:30:57,670 --> 00:30:59,799 NASA, you know, at a time 897 00:30:59,800 --> 00:31:01,629 when there was a lot of money given to 898 00:31:01,630 --> 00:31:03,789 the space race. 899 00:31:03,790 --> 00:31:04,719 That crash happened 900 00:31:04,720 --> 00:31:06,979 because 901 00:31:06,980 --> 00:31:09,069 of bureaucracy and tensions between 902 00:31:09,070 --> 00:31:11,469 bureaucrats and scientists about 903 00:31:11,470 --> 00:31:13,629 the O-rings on the side of the space 904 00:31:13,630 --> 00:31:16,209 shuttle, which they knew would break 905 00:31:16,210 --> 00:31:17,949 under certain climatic conditions. 906 00:31:17,950 --> 00:31:20,379 So there's a whole book just 907 00:31:20,380 --> 00:31:21,789 about this thing called the 908 00:31:21,790 --> 00:31:24,010 normalization of deviance: 909 00:31:25,030 --> 00:31:27,189 basically, it's very hard to identify 910 00:31:27,190 --> 00:31:29,229 who's responsible, but you have to find a 911 00:31:29,230 --> 00:31:30,999 way to do it. And you can't say that 912 00:31:31,000 --> 00:31:33,249 this one programmer did this thing. 913 00:31:33,250 --> 00:31:35,829 So that's why this idea that technology 914 00:31:35,830 --> 00:31:37,659 ethics has to be thought of as something 915 00:31:37,660 --> 00:31:39,609 that is kind of produced contextually, 916 00:31:39,610 --> 00:31:41,739 and you have to actually start kind of 917 00:31:41,740 --> 00:31:44,049 laying things out and maybe weighting 918 00:31:44,050 --> 00:31:45,050 and rating them. 919 00:31:46,090 --> 00:31:47,619 So in a way, this kind of modeling 920 00:31:47,620 --> 00:31:50,019 approach is inevitable at one level. 921 00:31:50,020 --> 00:31:52,179 But I think that people have to 922 00:31:52,180 --> 00:31:54,279 try something else, aside from 923 00:31:54,280 --> 00:31:56,499 just saying, oh, ethics is an answer 924 00:31:56,500 --> 00:31:58,119 to whether it kills this person or that person. 925 00:31:58,120 --> 00:31:59,469 That's not ethics.
926 00:31:59,470 --> 00:32:00,849 If you think of it as a framework for 927 00:32:00,850 --> 00:32:02,859 living well and dying well, it's not 928 00:32:02,860 --> 00:32:04,299 about killing well, which is actually 929 00:32:04,300 --> 00:32:05,599 what you see with drone warfare. 930 00:32:05,600 --> 00:32:06,759 And there's a term for this, necro- 931 00:32:06,760 --> 00:32:08,229 ethics; Grégoire Chamayou talks 932 00:32:08,230 --> 00:32:10,929 about this. It's 933 00:32:10,930 --> 00:32:11,949 really complex. 934 00:32:11,950 --> 00:32:13,749 I don't know the answer, but I think we 935 00:32:13,750 --> 00:32:14,859 should try to figure it out. 936 00:32:16,820 --> 00:32:18,949 Yes, we have time for one 937 00:32:18,950 --> 00:32:21,629 last question. I'm sorry, but, you 938 00:32:21,630 --> 00:32:23,449 know, we can talk afterwards. If that's me, 939 00:32:23,450 --> 00:32:25,549 then, yeah. 940 00:32:26,950 --> 00:32:28,190 So my question is: 941 00:32:30,110 --> 00:32:32,119 I believe that this might be the first 942 00:32:32,120 --> 00:32:34,279 time that we are forced to agree 943 00:32:34,280 --> 00:32:36,619 on a common set of ethics. 944 00:32:36,620 --> 00:32:38,359 I think that is one of the big topics 945 00:32:38,360 --> 00:32:40,639 that somehow the car manufacturers 946 00:32:40,640 --> 00:32:42,739 are trying to avoid, because that would 947 00:32:42,740 --> 00:32:44,899 mean that they decide the 948 00:32:44,900 --> 00:32:46,789 set of ethics that everybody has to 949 00:32:46,790 --> 00:32:48,799 subscribe to, because we are going to 950 00:32:48,800 --> 00:32:51,109 forge this into software and copy 951 00:32:51,110 --> 00:32:52,819 it all over the place. 952 00:32:52,820 --> 00:32:55,039 Is that studied as well? I mean, 953 00:32:55,040 --> 00:32:56,509 I did not see this mentioned in your 954 00:32:56,510 --> 00:32:57,919 talk. 955 00:32:57,920 --> 00:32:59,389 That's kind of a problem, 956 00:32:59,390 --> 00:33:00,390 I think. 957 00:33:02,310 --> 00:33:04,409 Actually, car manufacturers (and I know 958 00:33:04,410 --> 00:33:06,509 one of them I'm kind of talking to; I had 959 00:33:06,510 --> 00:33:07,979 some interactions with one of them) 960 00:33:07,980 --> 00:33:10,079 will be very cagey and they will tell 961 00:33:10,080 --> 00:33:11,339 you about the trolley problem. 962 00:33:11,340 --> 00:33:13,499 Again, that ethics comes down to how 963 00:33:13,500 --> 00:33:15,389 to program the software to act in the 964 00:33:15,390 --> 00:33:17,219 case of an accident, because I don't 965 00:33:17,220 --> 00:33:19,289 think they know. And I think we've had, 966 00:33:19,290 --> 00:33:22,109 in the last 50, 60 years, incredible 967 00:33:22,110 --> 00:33:24,569 movements and scholarship saying 968 00:33:24,570 --> 00:33:26,399 that these kinds of very 969 00:33:26,400 --> 00:33:28,379 heteropatriarchal notions of ethics, that it's 970 00:33:28,380 --> 00:33:30,839 about logic and rationality, 971 00:33:30,840 --> 00:33:32,849 come from a certain kind of place. 972 00:33:32,850 --> 00:33:34,799 Why are we reproducing and replaying 973 00:33:34,800 --> 00:33:37,469 these things when there are incredible 974 00:33:37,470 --> 00:33:39,389 kinds of technology ethics, anticipatory 975 00:33:39,390 --> 00:33:41,579 ethics, computer ethics, robot ethics 976 00:33:41,580 --> 00:33:43,179 that are all being developed as well?
977 00:33:43,180 --> 00:33:44,939 So it's very curious that they go back 978 00:33:44,940 --> 00:33:47,009 and pick up this very kind of 979 00:33:47,010 --> 00:33:49,079 antiquated idea of ethics. 980 00:33:49,080 --> 00:33:50,879 I think this is what they believe, 981 00:33:50,880 --> 00:33:52,619 and I have a feeling that's what 982 00:33:52,620 --> 00:33:54,299 they're going to actually test and 983 00:33:54,300 --> 00:33:55,959 program. But they are also nervous. 984 00:33:55,960 --> 00:33:57,479 I mean, I read some of the business news 985 00:33:57,480 --> 00:33:59,549 and they're kind of cagey about 986 00:33:59,550 --> 00:34:01,319 it. My understanding is that 987 00:34:01,320 --> 00:34:02,699 they're cagey about it, but I have a 988 00:34:02,700 --> 00:34:03,749 feeling they're going to do something 989 00:34:03,750 --> 00:34:05,639 like this anyway, because the other way 990 00:34:05,640 --> 00:34:06,839 is too hard. 991 00:34:06,840 --> 00:34:08,309 If you actually try to 992 00:34:09,420 --> 00:34:11,579 be more contextual and 993 00:34:11,580 --> 00:34:13,259 process-oriented... I don't know. 994 00:34:13,260 --> 00:34:14,698 I'd love to actually work with a 995 00:34:14,699 --> 00:34:16,198 computer scientist or a programmer and 996 00:34:16,199 --> 00:34:17,609 actually try to develop a program to 997 00:34:17,610 --> 00:34:20,039 say: could you actually map, 998 00:34:20,040 --> 00:34:22,439 just for speculation's sake, 999 00:34:22,440 --> 00:34:24,869 all of the different actors 1000 00:34:24,870 --> 00:34:26,999 and points at which you would see 1001 00:34:27,000 --> 00:34:28,769 responsibility and accountability? 1002 00:34:30,210 --> 00:34:31,408 I don't know. 1003 00:34:31,409 --> 00:34:32,388 Thanks. 1004 00:34:32,389 --> 00:34:34,379 Yes. So, thanks again, Maya. 1005 00:34:34,380 --> 00:34:36,718 The machines just told us to stop. 1006 00:34:36,719 --> 00:34:38,099 I'm sorry. 1007 00:34:38,100 --> 00:34:40,379 So thanks again, Maya. 1008 00:34:40,380 --> 00:34:42,388 And let's give her a warm round of applause. 1009 00:34:42,389 --> 00:34:45,198 And thanks for the applause 1010 00:34:45,199 --> 00:34:46,199 today.
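Purely as a speculative illustration of the mapping exercise described in that last answer (nothing the speaker or any manufacturer has actually built), one could start by writing decision points and the actors attached to them down as plain data; every entry below is invented.

    # Speculative sketch of a "responsibility map": who touched which decision.
    # Every name, actor and artifact here is invented for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DecisionPoint:
        description: str
        actors: List[str] = field(default_factory=list)     # who was involved
        artifacts: List[str] = field(default_factory=list)  # where the decision lives

    responsibility_map = [
        DecisionPoint(
            description="Route planner penalizes 'high-crime' areas",
            actors=["data vendor", "mapping team", "product manager"],
            artifacts=["crime-score dataset", "edge-cost weighting"],
        ),
        DecisionPoint(
            description="Child-journey sharing switched on by default",
            actors=["UX designer", "legal team", "parent focus group"],
            artifacts=["default settings", "consent flow"],
        ),
    ]

    for point in responsibility_map:
        print(point.description)
        for actor in point.actors:
            print("  involved:", actor)

Even a toy structure like this makes the point that responsibility is distributed across datasets, weightings, defaults and teams rather than sitting with any single programmer.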