*Preroll music*

Herald: Good evening, everybody. The upcoming talk is titled "Listen to Your Heart: Security and Privacy of Implantable Cardio Foo" and will be delivered by e7p, who is a Ph.D. student at the Max Planck Institute, and Christoph Saatjohann, who is a Ph.D. student at the University of Applied Sciences Münster, where he also teaches security of medical devices. This talk will also be translated into German. Dieser Vortrag wird auch simultan übersetzt in Deutsch. ["This talk is also being simultaneously interpreted into German."] And that is also the extent of my German. I would say: e7p, over to you.

e7p: Thanks a lot for the nice introduction. I hope you can all hear and see us. Welcome to the talk "Listen to Your Heart: Security and Privacy of Implantable Cardio Foo". I'm Endres, and as mentioned, I'm a Ph.D. student at the Max Planck Institute for Security and Privacy; my main topic is embedded security. This work is part of a funded project called MediSec, a cooperation between cybersecurity researchers in Bochum and Münster and medical technology researchers and staff of the University Hospital Münster.

To start off, we want to give a quick motivation for our topic. We chose implantable cardiac defibrillators and other heart-related implantable devices. There are different kinds of these: the classical cardiac pacemakers, which everyone will already have heard about; implantable defibrillators, which cover additional applications; and implantable heart monitors, which are used purely for diagnosis. As these implants sit inside your body, threats against them pose a high risk, and they have communication interfaces similar to those of the Internet of Things.

We also want to say a word about the ethical side. Why hack medical devices? First, the obvious point: device lifecycles in the medical sector are long, presumably because of the strong regulations and required certifications for medical products, so it pays off to keep a device on the market as long as possible. For the same reason, it is practically certain that old hardware and software still contain open bugs, and the manufacturers need to know about these issues to be able to do something about them. One disclaimer for affected patients: nothing we talk about here should influence the decision for or against such a device.
Because after all, these devices can save your life.

Before we get into the details, a quick word on the responsible disclosure process. When we found bugs and vulnerabilities, we informed all the involved companies, at least six months ago, by now closer to a year. The companies took us seriously and acknowledged our results. Our shared goal is not to worry affected people, but to improve product security: vulnerable devices are being or will soon be replaced, or at least their firmware gets updated. Independent security research like this helps keep product quality high, which is in both our interest and theirs. One note: if you ever find bugs or vulnerabilities in a product, please inform the company first, before publishing anything online or anywhere else.

OK, let's get started. First, I want to describe the devices in the ecosystem around the implants. There are the implants themselves: small, lightweight devices placed under the skin near the heart. I'm not from the medical sector, but in any case they sit inside the body, and they have one or multiple contacts to which electrodes connect; the electrodes connect them to the heart muscle or the organ they act on. Since there is no external connector for configuration or for reading out test data and events, there is a programming device, which is usually located in the hospital or heart clinic. Then there is a home monitoring station, which the patient takes home and puts on the bedside table, for instance; it receives the relevant data from the implant every day and transmits it onward to the doctor. This does not happen directly, but over the manufacturer's infrastructure, and the transmission usually runs over the internet. The doctor then retrieves the data, again over the internet. So these are the four big spots where data is transmitted to and from, and an attacker could try to find vulnerabilities in any of these four devices as well as in their communication interfaces and protocols.

To be a bit more concrete: in total, there are around five major vendors worldwide that develop these implants and the devices around them, and we tried to analyze the three shown at the top here.
We will now go into more detail about what we analyzed and what we found. Going back to the implants: this is roughly what one looks like from the inside. It's probably hard to see on camera, but there is also a picture in the slides. First of all, these implants contain the desired functionality, for instance defibrillator, pacemaker, or heart recorder, and these features have different requirements. A defibrillator needs much more power, so it needs a larger battery or a huge capacitor; a heart monitor needs none of that. Of course, all of them need a communication interface, which is realized over radio frequency, though sometimes only over inductive coupling, which you may know from RFID. Looking inside these devices, we see highly customized parts: unlabeled chips, even unpackaged dies bonded directly onto the PCBs, so analysis is quite hard. But all of them have in common that a small microcontroller inside handles everything, including the communication.

Then there are the home monitoring units; I have one here, it looks like this. As I said, they sit on the bedside table and transmit the data to the doctors on a daily basis. They also need a wireless communication interface to the implant, and once they have the data, they need to transmit it onward, which is usually done over a GSM or UMTS mobile network with a machine-to-machine SIM; the data is then sent to the manufacturer's server. Compared to the implants, these units are based on standard multipurpose hardware, which means we often find Linux operating systems and plenty of debug ports, serial interfaces, or USB. So they are easily accessible for us.

And then we have the hospital programmer. These are used in cardiology clinics; they can configure the implants, use the implants' test modes, and, like the home monitoring units, read out stored events or live data. They sit in the heart clinic and are operated by doctors, and usually they are rented or leased to the hospitals by the manufacturer. However, we found that individuals can buy them second-hand on specialized platforms, something like an eBay for medical devices, I would say. And now on to our methodology for analyzing the devices.
First, we thought about the goals of a potential attacker. Mainly, an attacker would like to influence the implanted device itself, which is mostly possible over the interface the programming device uses; injecting malicious firmware into the implant could be one goal. Another goal could be to spoof the GSM connection of the home monitoring box and then dump medical or other privacy-related data. And looking at the programming device, one could also think about direct misuse, for example of the test modes already included in the device.

What research questions result from this? First: what is possible with direct interaction with genuine devices alone, that is, non-invasively? Second: how secure are these devices in general, when invasive attacks are allowed as well? And third: can we fully understand the communication protocols, or is that rather difficult?

Now let's look at these attack vectors more concretely, using the devices we got from our partner hospital. To start off, we looked at the Biotronik home monitoring unit. What we did there was run a rogue GSM cell, i.e., GSM spoofing with OpenBTS, and this allowed us to intercept data. The data we captured was not encrypted, so we could reveal the access credentials. The same credentials can also be found by dumping the firmware from the microcontroller, which we did over the JTAG interface. This firmware could be reverse engineered with Ghidra, an open-source reverse engineering tool. There we also found an AES cipher implementation, which is mainly used for authentication steps. The firmware further contained the credentials and the internet domain cm3-homemonitoring.de. According to the manufacturer, this domain is only used as an authentication realm; however, they were rather surprised when we told them that they were actually using the same domain for other services. But I hope they won't do that anymore, so this should be fine.
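[Editor's note: as an illustration of the firmware triage described above, here is a minimal sketch of scanning a dumped image for printable strings such as credentials or domain names, in the spirit of the Unix strings tool. This is not the exact tooling used in the talk; the file name and the keyword list are made up for illustration.]

    # Scan a firmware dump for printable ASCII runs and flag likely
    # credentials or domain names. File name is a hypothetical placeholder.
    import re
    import sys

    MIN_LEN = 6  # ignore very short runs; they are mostly noise

    def printable_strings(blob: bytes, min_len: int = MIN_LEN):
        # Yield (offset, text) for runs of printable ASCII bytes,
        # similar to what the Unix `strings` tool reports.
        for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, blob):
            yield match.start(), match.group().decode("ascii")

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "firmware_dump.bin"
        with open(path, "rb") as f:
            data = f.read()
        for offset, s in printable_strings(data):
            # Keyword list is illustrative, not what the researchers used.
            if re.search(r"(\.de|\.com|user|pass|realm)", s, re.IGNORECASE):
                print(f"0x{offset:08x}: {s}")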
Next up is the Medtronic home monitoring unit; the approach there was similar to the Biotronik one. We found that a spoofing attack was not really possible, because this unit uses a machine-to-machine SIM card that lives on an intranet rather than on the public internet, something like a VPN, so we could not get a connection to the original service. However, we found a method documented in a blog post for recovering the encryption password, because the firmware of this device is encrypted. It turned out to be an embedded Linux system, which we could also influence after opening up and tearing down the device: taking out what I believe was an SD card and overwriting some files. In the picture you can also see how we could display an image of our own on the screen of the device; this was done using DBus messages, since it really is an embedded Linux. Here we also found the server backend addresses in the firmware, but more on that later.

The third device we analyzed was this Boston Scientific programming device; you can switch the camera so we can see it more clearly. This rather huge device we could buy for around 2,000 U.S. dollars on that auction platform, and we could tear it down because it is never used in productive settings anymore. Inside we found a hard disk with a Linux system on it, which I think dates from 2002, as shown in the slides. The device itself is Intel Pentium based and was designed in 2004, and the software is from 2011. So quite old, right? The Linux system on the device also contained the twm window manager, and modifying files and shell scripts on the hard disk allowed us to start twm on certain actions. From there on, we could simply open an xterm shell and we had root rights. Maybe I will show that in the live demonstration later. We also found a region-lock configuration file, which we could alter so that the device would talk on the radio frequencies that implants use anywhere in the world. And since the Linux kernel is so old, further exploits are likely possible; luckily, the device is being replaced right now, according to Boston Scientific.

One nice thing about this device is that we found an x86 binary called "checkDongle" on the hard disk, and this checkDongle is a rather small binary: it just checks for direct loopback connections on the printer port. By reverse engineering these connections, we could rebuild a genuine dongle.
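[Editor's note: parallel-port dongles of this kind typically wire a few data pins straight back to status pins, so such a check can be sketched as below. The I/O base address and the pin mapping are assumptions for illustration, not the actual wiring of the rebuilt dongle.]

    # Hypothetical sketch of a parallel-port loopback check like the one
    # "checkDongle" appears to perform: write patterns to the data register
    # and verify that wired-through bits come back on the status register.
    import os

    BASE = 0x378       # classic LPT1 data register (assumed port address)
    STATUS = BASE + 1  # status register; upper bits are input pins

    def outb(fd: int, port: int, value: int) -> None:
        # /dev/port maps I/O port numbers to file offsets (Linux, needs root).
        os.lseek(fd, port, os.SEEK_SET)
        os.write(fd, bytes([value]))

    def inb(fd: int, port: int) -> int:
        os.lseek(fd, port, os.SEEK_SET)
        return os.read(fd, 1)[0]

    def dongle_present(fd: int) -> bool:
        # Assumed mapping: the dongle loops data bit 0 back to status bit 4.
        for pattern in (0x00, 0x01):
            outb(fd, BASE, pattern)
            echoed = (inb(fd, STATUS) >> 4) & 1
            if echoed != (pattern & 1):
                return False
        return True

    if __name__ == "__main__":
        fd = os.open("/dev/port", os.O_RDWR)
        try:
            print("dongle detected" if dongle_present(fd) else "no dongle")
        finally:
            os.close(fd)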
With this dongle, we were then able to change the RF region setting in the general menu of the device, but we could additionally boot into the integrated firmware upgrade utility from a special USB drive, or access the BIOS configuration and boot from other media. This, of course, could leak stored treatment data and personal data of patients, whether stored on the hard disk itself or captured later once something has been modified.

OK, now I have prepared a little live demonstration with this programming device. Maybe the camera operators can switch to the live feed from the device; you will see what I see here. First, I will quickly show how the device itself works: I just put this antenna on an implant, and the interrogate button starts the software specific to that implant. As you can already see, the window manager has started too, so if we want, we can open an xterm terminal, and with a keyboard connected we can type something, and we are root. In this standard interface we can now access test modes and settings of the implant, but I'm not really into the medical side, so let's skip changing any settings here. What else we can do is use this security dongle: I just plug it in and restart the device with a flash drive attached. It runs through the normal BIOS POST, and when that is done, it simply boots from the USB flash drive. One special thing about this flash drive: I had to find one that supports USB 1.1, because the hardware is so old. But finally I got it to boot from this USB drive, and after a while, once everything is loaded, you see it simply starts a FreeDOS operating system and then starts Doom. So now we can play Doom on a hospital's programming device. Quite interesting, right? OK, I think you can switch back to the slides, please.

So that was the programming computer. What is still missing is the server infrastructure between the home monitoring unit and the doctor.
First, we looked at the home monitoring unit's access to the manufacturer. Using the credentials and the HTTP domain (or IP address) from the Medtronic home monitoring system, we were able to reach the HTTP web server to which the data is transmitted, I believe via POST requests. However, whatever we sent to the server resulted in a blank page with status code 200. No matter what we sent, even deliberately malformed data, it just returned that blank page. This seems to be a measure against misuse, and maybe that is not so bad. I don't recall whether we looked for additional encryption there; probably the transport is only TLS-encrypted or something similar.
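[Editor's note: a minimal sketch of this kind of probing, assuming a placeholder endpoint; the real backend address came from the device firmware and is deliberately not reproduced here.]

    # POST arbitrary payloads and observe the server's uniform response.
    import requests

    ENDPOINT = "https://monitoring.example.com/upload"  # placeholder host

    for payload in (b"", b"garbage", b"A" * 4096):
        resp = requests.post(ENDPOINT, data=payload, timeout=10)
        # Behavior reported in the talk: status 200 with an empty body,
        # regardless of what was sent.
        print(resp.status_code, len(resp.content))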
Then the doctor also gets the data from the manufacturer's server, usually over a web interface, as we learned from our partner hospital. Looking around there, it didn't seem that bad: there is a typical username/password login. And there we stopped, because these are productive systems; we did not want to try SQL injection or anything like that on a live system where life-critical monitoring is probably running. So we didn't want to risk anything, better to stop there and let it be. But from a first look, it seemed quite okay.

A quick summary of the technical findings: there are several security vulnerabilities in different devices. Patients could certainly be harmed if therapy-relevant data were manipulated; however, there is usually a doctor in between, so if the doctor is told that something is supposedly wrong, they would probably look at the patient and find out what is actually going on. We also found that it is possible to obtain these medical devices: we got this programming computer for 2,000 U.S. dollars, which clearly shows that simply renting or leasing the devices out is not a sufficient safeguard; they should be designed to be as secure as possible in the first place.

What countermeasures could make things better? First, regular software updates and maintenance could resolve most of these issues. Also, it would be good to include medical professionals in the product engineering phase, because some test modes are no longer relevant once the implant has finally been inserted into the body; after surgery, nobody needs those test modes anymore, for example. And last but not least: please use state-of-the-art cryptography and PKIs, and maybe open protocols as well, to improve security and build something that is as secure as it gets.

OK, that is the technical part, and I would like to hand over to Christoph, who will tell us about the GDPR requests, responses, and nightmares.

Christoph Saatjohann: Yeah, thank you. My name is Christoph Saatjohann, from Münster University of Applied Sciences, and I will tell you something about the privacy part, because, as you already heard, there is a lot of data in this ecosystem: data flows from the implanted device to the home monitoring unit and then onward over the internet to the companies' servers. Our question was: what can we do here? How can we look into the data processing and the processes of the companies? What do they do with our data, or with the patients' data? We used the GDPR for this. The GDPR is the General Data Protection Regulation; it came into force in 2018, so it is not that new. During our study it was two or three years old, so we assumed the companies would already be prepared for such requests. Under the GDPR, the user, in our case the patient, can obtain information about the processed data. With Article 15 of the GDPR, the patient can ask about the purpose of the processing, the categories of data, and the recipients: for example, subcontractors who receive the patient's data to compute something, convert it into a PDF, or put it on the web interface for the doctors; those are typical tasks for subcontractors, i.e., other recipients of the data. With Article 20, it is possible to get a copy of the data itself: the patient can ask the company, "Please give me a copy of all my data," either to look into it or to move it to a different company.
For this moving from one company to another, called data portability, the data must be provided in a commonly used, machine-readable format, and machine-readable does not mean PDF, for example. For our topic here, for the measurement data, commonly used formats would be DICOM or HL7, not PDF. The GDPR also defines a maximum answer time: every request from a customer must be answered within four weeks at most. If it is a really complex request, this may be extended up to three months in total, but the customer has to be informed about the extension and the reasons for it. One last point: the GDPR defines two roles that matter for the rest of this talk. The first is the data controller, who is responsible for the complete data processing; the controller may share the data with other recipients, subcontractors, or subsidiaries of the company. Such a recipient is called a data processor, and it processes the data, but the party responsible for the processing remains the data controller. That is the important thing here: the data controller is responsible and, whatever happens, has to answer the request.

With these GDPR instruments, we thought about what we could do. Our approach was to recruit some patients with implanted devices and send GDPR inquiries in their names to the companies: "We are patient XY, we want to know something about our data, and we want a copy of our own data." Of course, one can argue that we are now handling very sensitive medical data ourselves, so we had to keep our case study itself GDPR-compliant. We talked to our data protection officer, presented our study design, and set up contracts with the patients so that we ourselves are GDPR-compliant. Hopefully it worked out; we haven't been sued, so I think it did. Then we waited for the answers from the companies and the hospitals and analyzed the results. We looked at completeness: is this dataset complete, or do the companies hold other data that was not provided? We also looked at data security, especially how the data is transmitted: do we get it via plaintext email, as ZIP files, or on CD-ROMs?
So we looked at the whole process, and of course at the response time; remember, four weeks is the maximum, in some cases perhaps three months, but the standard is four weeks. And, where required, we sent follow-up queries. As already said, we are responsible researchers, so we also did responsible disclosure here: we talked to the companies and discussed process improvements, what they could, should, or must do to be GDPR-compliant.

Let's look at the results of our case study. The first vendor was Biotronik. We sent the first inquiry to a Biotronik subsidiary, but learned that we had the wrong contact: we had simply taken a data privacy contact from some hospital documents, and they wrote back, "Sorry, we are the wrong company, we are just Biotronik's sales company, please contact this other company." So we wrote a second letter to that other company and got an answer after two months. Now remember: four weeks, not two months, so it was delayed. And the answer itself was also a bit unsatisfying, because Biotronik told us the device had never been connected to any home monitoring system, so no personal data was stored at Biotronik. We asked the patient: did you ever have one of these home monitoring devices? And he told us: no, I never got one. So this is a classic example of study design, in this case bad study design. First, get your contacts right; second, choose suitable participants for your study. That might be a future work item: perhaps choose a different patient.

The next company we wrote to was Medtronic; you already know them from the devices earlier. The answer was that we had to send a copy of the patient's ID card: they wanted identity verification. The GDPR does not define a strict method for this verification, or exactly when it is mandatory, but it does say it is possible in some cases, and since we are dealing with very sensitive medical personal data here, we think this is totally fine. Identity verification is fine with us, and it is a good thing that they really check that we are the person we claim to be.
They also recommended that we use the Medtronic secure email system, and at first we had a good impression, because that is much better than plaintext email; if they host a secure email system on their own servers, we thought, that's a good idea, we get a TLS-secured connection, looks perfectly fine. But when we sent some test emails, we saw from the email headers that the mail is routed through US servers of the Proofpoint company. And that is not really good: if I am a German or European customer sending a GDPR request to Medtronic Germany or any other EU Medtronic subsidiary, I have no knowledge that my email is routed through the US. For GDPR compliance we are also not sure this is actually allowed, given the discussions around the Safe Harbor arrangement, so it might not be GDPR-compliant; at the very least, it is not good for the user. It's a bad user experience.
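[Editor's note: the routing check itself is simple; a minimal sketch of inspecting the Received: chain of a saved reply, using only the Python standard library. The file name is a placeholder.]

    # Read the Received: headers of a saved message (.eml) to see which
    # servers relayed it; each Received: header names one hop, newest first.
    from email import policy
    from email.parser import BytesParser

    with open("reply_from_vendor.eml", "rb") as f:  # placeholder file name
        msg = BytesParser(policy=policy.default).parse(f)

    for i, hop in enumerate(msg.get_all("Received") or []):
        # Collapse folded whitespace for readability.
        print(f"hop {i}: {' '.join(hop.split())}")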
But OK, we used it anyway, because such a platform is still better than plaintext email, and we sent our inquiry through the system. The next point really surprised us. We had created a free GMX webmail account for this communication, and while we were waiting for the results, an email suddenly arrived in that GMX mailbox: the response from Medtronic was not sent via the secure channel at all, it came as plaintext email. So what's the point? They recommend the secure system to us, but use plaintext email themselves. That is something they really have to change. Then it went back and forth a bit; they wanted more information about us: which device we have, which serial number, which services we use. In the end we got a .doc file, a standard Word file, as an email attachment, to fill in and send back. From our security researcher's point of view, that is not the best way to do it, because Word files as email attachments are the classic vehicle for ransomware and phishing. We went along with it to get the final answer, but we would propose to change this process. So now we thought: OK, now we will finally get some real data.

But in the end, after all this back and forth and creating accounts on various systems, they just stated: oh, we were the wrong contact, the hospital is responsible; we are only the data processor in this case. Of course, this was a bit unsatisfying, because we had thought we were close to getting the data, and then it never happened. Analyzing this: in the end it might even be GDPR-compliant. We have no insight into the relationship between Medtronic and the hospital, so there may well be an agreement about who is the controller and who is responsible; we could not check that as a patient. But the user experience is certainly bad: you exchange so many emails, and in the end you get nothing.

The third vendor is Boston Scientific; you know Boston Scientific from the nice Doom device. We sent an inquiry to BSC and got a response that they wanted identity verification, the same as Medtronic, and we said: fine, sounds legit. They also said: you can use plaintext email, just write to this address, or you can use our online tool. Our patient chose email. From the security side we would have chosen the secure platform, but I can totally understand the patient, because the offer came as a hard-copy letter, a real letter by snail mail, and he would have had to type a really long URL containing random data; if you mistype one character, you have to start over. From the customer's point of view, the user experience is very bad here; nobody wants to type something like that into their browser and hope it is correct. Boston Scientific should use a different system, some short-link scheme, or a short domain plus a very short access code, anything better than this. So then we got an email, a plaintext email, which is of course not good: medical data via plaintext email is not good. One could argue that our patient started it by writing a plaintext email himself, but the common understanding of the GDPR is that even if the customer asks via plaintext email, the company may not respond in plaintext; it has to do something secure. So this too is something Boston Scientific should change. But hey, we got seven PDF reports, our first data in this case study, after I don't know how many emails and letters. But we got some data.
Then we thought: OK, seven PDF reports, and the device has been active for three years? That sounds like too little for three years. So we contacted the doctor, of course with the patient's consent, and the doctor looked into the online portal and saw a lot of data: a lot of raw data, many PDF reports and graphs, full of information. So we went back to BSC and said: this is not enough, we got seven reports, but the system is full of other data; we want all of it. BSC apologized: we did not look into the home monitoring system; you can have the data, but we need two extra months. As I said in the introduction, that can be acceptable for a really complex request. My feeling is that they first had to implement an export mechanism to fulfill our request, but OK: they asked for two months, and as long as we got the data, we were fine with that.

And within this extended deadline, within the three months, we got all the data. And by all the data I mean really a ton of it: a large ZIP file with a lot of HL7 data, i.e., a raw, digital, machine-readable format, plus episode data, also digital, as an Excel sheet. Now we were really happy, because this was really satisfying: the patient, or in this case we, got all the data in a genuinely GDPR-compliant way. That was the first and only vendor where our GDPR request was really fulfilled. One last point on security, though: the data was not sent directly by email; we got a download link via email. From a security perspective, that is more or less the same: a man in the middle who can read plaintext email can also click the link in the download email. So, OK, we got the data, but the security of the process must still be improved.
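[Editor's note: to give an idea of what working with such an HL7 v2 export looks like, here is a small sketch using the python-hl7 library (pip install hl7). The message below is a made-up minimal example; the real export contained far richer observation segments.]

    # Parse a minimal HL7 v2 message and pull out patient and observation
    # fields. HL7 v2 uses carriage returns as segment separators.
    import hl7

    raw = "\r".join([
        "MSH|^~\\&|HOMEMON|VENDOR|CLINIC|HOSPITAL|20210101||ORU^R01|1|P|2.5",
        "PID|1||123456||Doe^Jane",
        "OBX|1|NM|HEARTRATE||72|bpm",
    ])

    msg = hl7.parse(raw)
    pid = msg.segment("PID")          # patient identification segment
    print("patient:", pid[5])         # PID-5: patient name
    for obx in msg.segments("OBX"):   # one OBX segment per observation
        # OBX-3 identifier, OBX-5 value, OBX-6 units
        print("observation:", obx[3], obx[5], obx[6])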
We also had one hospital patient and sent one inquiry to a hospital. There were some things we had not been aware of: the hospital that had implanted the device, about five years before our inquiry, had since gone bankrupt, and we were told we had to contact a person at the hospital's old owner, i.e., the bankrupt company. We think this might not even be correct, because the explantation of the device was performed while the hospital was already under the control of the new owner, the one we had contacted, so there should be data about the explantation there. But we also wrote a letter to the old company, and we got an answer, though again after two months, so again the GDPR time frame was not met. And the final answer: we did get some data, as you can see here, this handwritten material, but we were missing, for example, the surgery report. Normally a hospital has to produce a lot of documentation for a surgery, but we did not get that, so we only received a part of the records.

In summary, and I won't go through every point here, you can see a lot of red: we got plaintext data via email, which is not acceptable and not GDPR-compliant; deadlines were missed; data was incomplete, or became complete only after we asked repeatedly and had a doctor double-check whether it was complete; and we often needed more than one request. For a patient who wants to exercise their right to their data, it is a really hard road, as you can see: it takes a lot of emails and a lot of time, and sometimes it is simply not possible, because they say, "Go to the hospital, go somewhere else, we are not responsible." Not a good user experience. Our recommendation for the companies, and for the hospitals too: be prepared for such requests, because the GDPR has now been in force for more than three years, and sooner or later requests will come, perhaps not from friendly researchers but from real patients. And if patients don't get an answer, they can go to the local data protection authorities, and the authorities can punish the company with a large fine. So, our recommendation: be prepared. And with this slide, I want to thank you for your attention and close the talk. We are happy to answer questions in the Q&A session. Thanks.

Herald: Thank you very much. I have a bunch of questions from the internet. The first one is: did you analyze binaries only, or did you have any access to source code? And did you ask any manufacturers...
e7p: Please repeat the question, I didn't get it because of...

Herald: Some technician, please mute the room. I can also (inaudible). Ah, the question is: did you analyze binaries only, or did you obtain some degree of access to the source code?

e7p: I'm not sure whether the checkDongle binary is meant, but that one was very small and we could analyze it easily using Ghidra to decompile it, and then just see which data needs to be at which position on the parallel port, if that was the question. For the other binaries from the home monitoring systems, we mostly just looked for strings at first, for access credentials or domain names; I don't think we did that much decompilation on the rest. The entire software of the programming computer, though, is obfuscated Java, just-in-time compiled obfuscated Java, and I haven't found a good way to deal with that. Other questions?

Herald: Thank you. Another question was: how many CVEs did you file from this research?

e7p: I'm not sure, because some of these findings had already been found by others, and we just didn't realize they had already been reported as CVEs or in CISA reports. But I think one, two, or three.

Christoph: Yes, there were some CVEs, and our results were also linked to other CVEs that had already been published. The companies did some internal research of their own, and that internal research got some CVEs, and from the CVE text you cannot always trace it back to the actual vulnerability. But at least for the Boston Scientific programmer there was a CISA advisory; a few months back, I think in September or end of August, an advisory for this programmer came out, and it listed, I don't know, four or five CVEs. And it will be replaced in the coming months, hopefully, because it is pretty old; it is the old generation, and the hospitals have to switch to the newer generation in the next months.

Herald: You mentioned the cooperativeness of the manufacturers on the subject access requests, but how cooperative were they on the technical side, when you tried to report and disclose?

e7p: They were actually quite cooperative. We simply wrote, "Hey, we found this and that," first to the press address or something.
And then we were redirected to the internal security group, the experts, so to speak. We then had Zoom meetings with them, and it was a good cooperation, I would say.

Christoph: Really, I think it was good communication at eye level; the shared goal of everyone was, of course, not to threaten or frighten patients, but to make the products more secure. I think regulation also plays a role, like CISA in the US: if there are vulnerabilities, they have to change it, so they also get some pressure from the regulators, and they really want to change things. That is my impression, and the discussions were really well organized and well structured, with a lot of people who were really deep into the topic; we asked questions and got really deep insights into the products. They were very helpful.

e7p: And I think all of the companies offered some jobs for security analysts.

Christoph: Oh yeah!

e7p: So anyone who's interested in jobs at Boston Scientific or Medtronic or Biotronik must...

Christoph: Just hack a device and you will get a job for it. I don't know.

Herald: And the last question I have for you is how difficult this was in terms of the technical skills needed. Did it require really high-tech exploits, or was it unbelievably easy? Where on the spectrum was it?

e7p: For me, the programming device was rather difficult, because I'm not that much into 1990s and 2000s PC architecture, so I had to learn a few things, and I had to figure out old custom hardware that talks over PCI. For the home monitoring units, I had to learn some embedded Linux and find out where the network stack is and how it all works. But all in all, I think I could have done it in maybe a month or so.

Christoph: It really depends on the knowledge you already have. Some of it was pretty standard, like sniffing buses on the hardware with a logic analyzer: if you have done that before, you can do it on these devices too. But if you never have, then you first have to figure out which pins belong to which bus, how to identify which bus is used, how to read out the EPROM, how to read out the memory chips. It highly depends on your previous work and previous knowledge. For Endres here, it's easy. *laughs*

e7p: *laughs* Maybe not.

Herald: OK, thank you. I think this concludes the session.
Thank you both for this interesting presentation; we'll see how your research develops further.

e7p: Thank you. Thanks for being here.

Christoph: And thanks for being there, remotely. Next time hopefully live and in person.

Herald: Yes, that would be so much better than this. Bye bye.

*Postroll music*

Subtitles created by c3subtitles.de in the year 2021. Join, and help us!