*36C3 preroll music*

Herald: Now for the next talk: he has worked for six years in the field of cryptography at the uni Karlsruhe - Oots.

*Applause*

Oots: OK, so thanks for the introduction and welcome to my talk. As our Herald just said, I've been working in the area of cryptography for the better part of the last six years. And I noticed that many people today have a mental image of what cryptography and encryption are, but sometimes this mental image does not really coincide with what is actually going on. So I wanted to give an introductory talk on cryptography that is accessible to a broad audience - an audience that does not have any prior exposure to cryptography, and maybe even an audience that does not have a background in maths or computer science.

So as I said, this talk is specifically aimed at a non-technical audience, even though, this being 36C3, we probably have a fairly technical audience. And this is a foundations talk, so I will not be speaking about fancy research or cutting-edge results; I will be talking about the plain basics. OK. Apart from working in cryptography, I enjoy doing applied cryptography, number theory and pwning, that is, exploiting all kinds of memory corruption bugs in programs. OK. This is a picture of my parents' cat - just because every talk should have pictures of cats in it.
And this is - *applause* - thanks. So this is my checklist for this talk. The first item, I think we already did. So for the remainder of the talk, I want to explain what encryption is, what it does and what it does not do. I want to explain authentication, which fixes a problem that encryption does not solve. I want to explain certificates, because they help a lot with both encryption and authentication. And in the end, I want to explain a little how the things I'm going to introduce work together and can be combined to build more useful things.

OK. So let's start with the first point: I would like to explain encryption. Encryption is basically a solution to a problem, so let's talk about the problem we have before we get to the solution. The problem here - or one of the classical problems - is that two parties want to communicate. We cryptographers commonly call them Alice and Bob, and Alice wants to send a message to Bob. In this case it's just a very simple message, a plain "hello", but cryptography has been used in diplomacy and the military for hundreds of years, so imagine that this message is something more critical. And the problem we want to solve is that there might be an eavesdropper who wants to listen in on the connection and read the message - read the content that is being sent from Alice to Bob.
What some people think cryptography works like is something like the following, which is close to the real thing, but not quite: Alice applies some kind of encryption procedure to her plaintext message to produce random, unintelligible gibberish, which we call the ciphertext. Then Alice sends the ciphertext to Bob, and Bob has the decryption procedure: he knows how to decrypt, that is, to invert the encryption procedure and recover the plaintext message. And now, the point that some people get wrong is that they think the knowledge of how to decrypt is itself secret. But that is not true today. In 1883, a person named Auguste Kerckhoffs formulated a couple of principles that ciphers used for military applications should adhere to, and one of these requirements became well known as Kerckhoffs's principle. It reads: "A cipher should not require secrecy, and it should not be a problem if the cipher falls into enemy hands." Rephrasing this a little: the cipher you are using should be secure enough that you can tell even your enemy - the attacker - how the encryption process and the decryption process work, without harming the security of the encryption.
Or to rephrase this yet another time: if the cipher you are using is so insecure that you cannot tell anyone how it works, then maybe you shouldn't be using it in the first place. So let's get back to this image. Now, if the attacker knows how to decrypt the message, then obviously this very simple scheme does not yield anything useful. So what people did is introduce a key into this picture. Now the encryption procedure and the decryption procedure use a key, which goes into the computation. Alice does some kind of computation based on the message and the key to produce the ciphertext, and Bob, who has the same key, can invert this encryption operation and recover the plaintext. As long as the key is known only to Alice and Bob, but not to the attacker, the attacker cannot use the decryption procedure. One general word here: I will not go into the details of how these boxes operate. Within these boxes, which represent computations, there is some maths or computer science going on, and I would like not to explain how these things work internally, in order to keep this talk accessible to a broad audience. OK, so a problem that we have here is that Alice and Bob have to agree on the key in advance.
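As a sketch of this symmetric picture - not something from the talk, just an illustration - here is a toy cipher in Python where one shared key both encrypts and decrypts. The XOR construction is deliberately simple and completely insecure; it only mirrors the shape of the scheme:

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy cipher: XOR every byte of the data with the (repeating) key.
    # Applying it twice with the same key gives back the original, so
    # encryption and decryption are literally the same computation,
    # with one shared key for both directions. NOT secure for real use;
    # real systems use ciphers such as AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_key = os.urandom(16)                     # agreed on in advance
ciphertext = xor_cipher(shared_key, b"hello")   # Alice encrypts
recovered = xor_cipher(shared_key, ciphertext)  # Bob decrypts, same key
assert recovered == b"hello"
```

Anyone who learns `shared_key` can run the exact same computation, which is precisely the key-distribution problem discussed next.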
So Alice cannot simply send the key over to Bob, because if she did, the attacker eavesdropping on the connection would learn the key as well as the message, and then the attacker could decrypt just like Bob. OK, so this does not work; this is a terrible attempt. For quite some years - actually until the 70s and 80s of the last century - people were only using this scheme. This is what we call symmetric encryption, because we could just flip the image around and Bob could be sending a message to Alice instead: encryption and decryption use the same key. And if there is symmetric encryption, you can guess there is something else, which is called asymmetric encryption. For asymmetric encryption, there is a pair of keys: one of them is used for encryption and one of them is used for decryption. And with an asymmetric encryption scheme, we can do something like the following. Bob generates such a pair of keys, one for encryption and one for decryption, and he keeps the decryption key to himself - this is why the decryption key is called the secret key. However, Bob can publish the encryption key for everyone to see. For example, it could be put in a kind of public registry, like a phonebook.
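This asymmetry can be illustrated - again as a toy, not from the talk - with "textbook" RSA, where the published encryption key and the withheld decryption key are different numbers:

```python
# Toy "textbook RSA" with tiny primes, purely to show the asymmetry:
# anyone can encrypt with the public key (e, n), but decrypting needs
# the secret exponent d. Numbers this small are trivially breakable;
# real RSA uses moduli of 2048 bits or more, plus padding.
p, q = 61, 53                # Bob's secret primes
n = p * q                    # public modulus, 3233
e = 17                       # public encryption exponent
d = pow(e, -1, (p - 1) * (q - 1))  # secret decryption exponent (Python 3.8+)

message = 65                        # a message encoded as a number < n
ciphertext = pow(message, e, n)     # Alice encrypts with Bob's PUBLIC key
recovered = pow(ciphertext, d, n)   # Bob decrypts with his SECRET key
assert recovered == message
```

The eavesdropper sees `(e, n)` and `ciphertext`, but without `d` cannot invert the encryption (for realistically large numbers).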
And now Bob can send his public key to Alice, and an eavesdropper listening in on the connection will learn the public key, but that's not a problem, because the key is public anyway. After we have done this, Alice can use Bob's encryption key to encrypt her message and send it over to Bob, and Bob can decrypt the message with his secret decryption key. However, the eavesdropper cannot simply decrypt the message: even though the eavesdropper has the encryption key, it does not have the decryption key, and thus it cannot decrypt. OK - however, this solution is still kind of risky. There is still a problem with this: we still have to make sure the keys are distributed in advance. If we used this simple scheme where Bob sends his public key to Alice, then there is a problem if the attacker is not simply passively eavesdropping on the connection, but is willing to actively interfere with it. For example, the attacker might intercept the public key that Bob is sending to Alice and replace it with his or her own public key. Then Alice would think that the key she received belongs to Bob and use this key to encrypt her message to Bob - and suddenly the attacker can read the message again. So at this point, let's summarize about encryption: encryption conceals the content of data.
And this is pretty much what it does - and pretty much the only thing it does. In particular, it does not conceal the fact that there is communication going on. An eavesdropper listening in on the connection can obviously see Alice sending a message to Bob, and thus knows there is communication going on between Alice and Bob - and this alone could be quite dangerous for them. Imagine Alice was working for an intelligence agency and Bob was a journalist: if the attacker sees Alice sending lots of documents to Bob, this might be a strong indication that Alice is a whistleblower, and Alice could be put into jail. Something more that is not concealed by encryption is the amount of data being exchanged. If Alice sends just a very short message to Bob, the eavesdropper can tell that what is being transferred is not a 20 GB file or something. All this kind of metadata is something that encryption does not conceal. And there are a couple more problems with encryption. One of them is that the attacker might change the message - protecting against changes to the message is not the job of encryption. Another problem is that keys must be exchanged in advance, which I already talked about. And there are more problems. For example, an attacker might simply record a message when it is sent.
And later, just replay this message to Bob. Or an attacker might block a message altogether - intercept the message and throw it into the trash, making sure it never arrives at Bob's side. OK, and the first problem here - an attacker might change the message - actually leads me to the second part of my talk, which is authentication. So on my talk checklist, let's mark encryption as done. Now, what is authentication? Authentication enables the detection of changes to data. It does not prevent changes from happening; it only enables the recipient to detect the changes after they have happened. OK. One example where something like authentication was needed is when Bob was sending his public key to Alice, but this is by far not the only scenario where authentication is needed. Imagine Alice is running a charitable organization - say, she's saving refugees from drowning in the Mediterranean Sea - and Bob wants to donate to Alice to help her do that. Then Alice has to send her bank account number to Bob so that Bob can make the donation. And notice that in this scenario, the message Alice is sending to Bob - her bank account number - is nothing that is secret. It does not have to be encrypted, because this information is public knowledge.
However, we do want to make sure that the message that arrives at Bob's side is indeed the correct bank account number - to prevent a scenario where a criminal intercepts the message and replaces the bank account number, so that Bob would send his money to the criminal's bank account instead of Alice's. And one way to realize authentication is, again, with a pair of keys: one of them is used for authentication and one for verification, that is, for checking whether a message has been changed or not. The authentication key must be kept secret - thus it is called the secret key - whereas the verification key can be made public, and it is called the public key. Now, with a setup like this, Alice can take the message she wants to send to Bob and apply some computation to it together with the secret authentication key, to produce something that we call a signature, or a digital signature. Then Alice sends this signature over to Bob along with her bank account number. And Bob will take the signature and the bank account number he receives and apply some kind of computation to them, and this computation will determine whether the bank account number has been changed or is in fact the original. So if the attacker changes the bank account number, Bob will be able to detect this change by checking the signature.
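As a concrete sketch - using the single-key flavour that comes up in a moment under the name MAC, because Python's standard library ships one (HMAC) - the verification computation accepts the genuine message and rejects any altered one. The key and the account number here are invented for the example:

```python
import hashlib
import hmac

# Invented values for the example: a pre-shared key and a made-up
# bank account number for Alice.
secret_key = b"pre-shared key between Alice and Bob"
message = b"IBAN: DE00 1234 5678 9012 3456 00"

# Alice: compute an authentication tag over the message.
tag = hmac.new(secret_key, message, hashlib.sha256).digest()

# Bob: recompute the tag and compare in constant time.
def verify(key: bytes, msg: bytes, received_tag: bytes) -> bool:
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, received_tag)

assert verify(secret_key, message, tag)               # unchanged: accepted
assert not verify(secret_key, b"IBAN: criminal", tag) # changed: rejected
```

Without `secret_key`, an attacker cannot compute a valid tag for a swapped-in account number, so the change is detected.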
And this holds even if the attacker changes not only the bank account number, but also the signature. OK? These things are designed in a way which hopefully makes it impossible for any attacker to come up with a valid signature for anything other than the original message. So the only way Bob will accept the signature is if the attacker did not, in fact, change the bank account number - and in that case, it is safe for Bob to transfer the money. OK, but here is a different solution to this problem. It's actually pretty much the same, except that now we have just a single key, which is used for both authentication and verification. In this case, things simply have a different name: they work in exactly the same way, except that the signature is called a message authentication code, or MAC for short. OK. But in both of these scenarios - whether we have two distinct keys or just one key - we still have the problem of key distribution. Imagine that, in the scenario with two keys, Alice was sending her public key to Bob. Then we would have the same attack as before: the attacker could just go ahead and change the key that Alice is sending to Bob and exchange it for his own key.
And so if the attacker is sending his own public key - his own verification key - to Bob, then obviously the attacker can create a valid signature for his forged bank account number, and Bob would accept it. OK, so again we have this problem of key distribution, which is that the verification key must be known to Bob in advance. And this leads me to the next section of my talk. So let's mark authentication as done and go on with certificates. A certificate is a document that confirms that a specific public key belongs to a specific entity - for example, a person or an organization. And to see how certificates are used, let's just go back to the scenario we had before. Alice wants to send her bank account number, her public key, and a signature for her bank account number to Bob, and an attacker might change the public key and the bank account number and the signature. Now, if we add certificates into this, we need to add something that we call a certificate authority. This is a trusted third party which creates certificates confirming the association between a person and a public key. So before Alice sends a message to Bob, she will walk up to the certificate authority and say: "Hey, certificate authority. This is my public key. I'm Alice. Please give me a certificate."
And then the certification authority will check that Alice is indeed Alice and that Alice indeed owns this public key. If Alice passes these checks, the certification authority will create a certificate and hand it to Alice. The certificate is just a document which says that the certification authority has verified that the silvery key here on the slides belongs to Alice. And now, once Alice has the certificate, she can just send her public key to Bob together with the certificate. Then Bob, if he knows the certificate authority's public key, can check that the certificate is indeed correct - that it was indeed created by the certificate authority. And if he trusts the certificate authority, he will know that the silvery key is, in fact, Alice's. Afterwards, Bob, convinced that the silvery key is Alice's, can check the message that Alice is sending to him and make sure it has not been changed. OK. So we're not completely free from the key distribution problem yet, however, because Bob still has to know the public key of the certification authority in advance. So Bob does not need to know Alice's public key in advance, but he needs to know the public key of the certification authority in advance.
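A toy version of this issuing-and-checking flow - with an invented statement format and a tiny RSA key for the CA, nothing like a real X.509 certificate - might look like this:

```python
import hashlib

# The CA's toy RSA key pair (tiny textbook primes, trivially breakable;
# real CAs use e.g. 2048-bit RSA or elliptic-curve keys).
p, q = 61, 53
n = p * q                          # public
e = 17                             # public verification exponent
d = pow(e, -1, (p - 1) * (q - 1))  # the CA's SECRET signing exponent

def digest(data: bytes) -> int:
    # Hash the document down to a number below the toy modulus.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

# Invented statement; a real certificate carries names, key material,
# validity dates, and much more.
statement = b"public key 0xABCD belongs to Alice"
signature = pow(digest(statement), d, n)  # the CA issues the certificate

# Anyone who knows the CA's public key (e, n) can check the binding:
assert pow(signature, e, n) == digest(statement)

# A forged signature does not verify:
assert pow((signature + 1) % n, e, n) != digest(statement)
```

The point of the sketch: verification needs only the CA's public values, which is why Bob must know those (and only those) in advance.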
And in practice, there's not just a single certification authority; there's a whole bunch of them, and certification authorities can even create certificates for other certification authorities, and so on. So Bob does not have to know the public keys of everyone he's communicating with; he only has to know the public keys of a couple of certification authorities. Okay, so let's summarize about certificates. As I said before, certificates confirm that a specific public key belongs to a specific entity, like a person or an organization. But we're still not completely free from the key distribution problem, because people have to know the certificate authorities' public keys. And another problem here is that this scheme gives an enormous amount of power to a certification authority. If an attacker can compromise a certification authority, he could force it to create fake certificates connecting fake keys to real identities. So he could create a fake certificate which says that the certification authority has checked that the attacker's public key belongs to Alice. And fixing this problem of the certification authorities' power is something that cryptographers are still working on - that's still a problem today. And in fact, this problem is not just theoretical.
There is a number of incidents that have happened with certification authorities. One famous example is the DigiNotar case, where a certification authority named DigiNotar was in fact hacked, and the attackers created a fake certificate for a google.com domain - or one of the other Google domains, I don't exactly remember. And then these certificates showed up being used in Iran. OK, so this is not just a theoretical problem; it has in fact happened before. OK, so this concludes what I wanted to say about certificates. Let's move on and see how these things can be put together to build more complex, but also more useful, tools. One of the tools I want to introduce is called authenticated encryption, and it's basically a combination of encryption and authentication. For some reason, people use this phrase mostly in the symmetric case, where there is one key for encryption and decryption and one key for authentication and verification. You could pretty much recreate the same scheme in an asymmetric fashion - and that is also done in practice - but in that case, people just don't call it authenticated encryption. So one way to build authenticated encryption is this: if Alice wants to send a message to Bob, she will encrypt the message using the encryption key and send the ciphertext over to Bob.
And then she will take a copy of the ciphertext and compute a message authentication code from it using the second key that she has, and Alice is going to send that message authentication code over to Bob, too. Now Bob can decrypt the message using the key he has, and additionally, Bob can check whether this message has been changed or whether it is the original, by using the verification procedure. OK, so again, this kind of authentication does not prevent changes from happening, but Bob can check whether a change has happened. And in fact, this kind of authenticated encryption can actually boost the security of the encryption scheme. OK, so another thing I wanted to talk about is called hybrid encryption. This is the combination of symmetric encryption and asymmetric encryption. And the reason why this is interesting is that asymmetric encryption is usually quite slow compared to symmetric encryption. So if you wanted to send a very long message to Bob and you only had a public-key encryption scheme - an asymmetric encryption scheme - then it would take a very long time to encrypt the message and to decrypt the message.
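Returning to authenticated encryption for a moment: the encrypt-then-MAC flow just described can be sketched as follows. The toy XOR cipher stands in for a real cipher and is not secure; HMAC from Python's standard library plays the role of the MAC:

```python
import hashlib
import hmac
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy cipher for illustration only (NOT secure); it stands in for
    # a real symmetric cipher such as AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

enc_key = os.urandom(16)   # key for encryption and decryption
mac_key = os.urandom(16)   # separate key for authentication/verification

# Alice: encrypt first, then compute a MAC over the ciphertext
# ("encrypt-then-MAC").
ciphertext = xor_cipher(enc_key, b"hello")
tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()

# Bob: verify the tag BEFORE decrypting; reject on any mismatch.
def receive(ct, received_tag):
    expected = hmac.new(mac_key, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, received_tag):
        return None                   # change detected: reject
    return xor_cipher(enc_key, ct)    # authentic: decrypt

assert receive(ciphertext, tag) == b"hello"
assert receive(ciphertext + b"x", tag) is None  # tampering is detected
```

Checking the MAC over the ciphertext before decrypting is one reason this construction can strengthen the encryption scheme itself: tampered ciphertexts never even reach the decryption step.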
However, you can combine asymmetric encryption and symmetric encryption in a way that makes the encryption process faster, and the way you do this is: if Alice wants to send a message to Bob, Alice first generates a new key for the symmetric encryption scheme. Alice encrypts her message with this key and sends the ciphertext over to Bob. Afterwards, Alice takes the symmetric key she has just generated, encrypts this key with Bob's public key, and sends that over to Bob as well. Now Bob can decrypt the symmetric key using his secret decryption key - the kind of golden one here on the slides - to recover the symmetric key, and afterwards Bob can use the freshly recovered symmetric key to decrypt the actual message. However, an eavesdropper listening in on the connection cannot decrypt the message, because it does not have the symmetric key, and it cannot decrypt the symmetric key, because it does not have Bob's secret decryption key. OK. You can continue to build on these kinds of things, and what you end up with is something called Transport Layer Security, or TLS for short. Transport Layer Security is a network protocol that combines many of the things I've introduced so far: it combines encryption - either symmetric or hybrid - with authentication.
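The hybrid construction just walked through, end to end, as a toy sketch (tiny textbook RSA wraps the session key; an insecure XOR cipher stands in for the fast symmetric cipher):

```python
import os

# Bob's toy RSA key pair (tiny textbook primes, illustration only).
p, q = 61, 53
n = p * q
e = 17                              # Bob's public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # Bob's secret exponent

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher (NOT secure), standing in for a fast
    # real-world cipher such as AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# 1. Alice generates a fresh symmetric key for this message
#    (a single byte here, so it fits below the toy modulus n = 3233).
session_key = os.urandom(1)

# 2. She encrypts the long message with the fast symmetric cipher ...
message = b"a rather long message " * 100
ciphertext = xor_cipher(session_key, message)

# 3. ... and encrypts only the short session key with Bob's PUBLIC key.
wrapped_key = pow(int.from_bytes(session_key, "big"), e, n)

# Bob unwraps the session key with his SECRET exponent,
# then uses it to decrypt the actual message.
unwrapped = pow(wrapped_key, d, n).to_bytes(1, "big")
assert unwrapped == session_key
assert xor_cipher(unwrapped, ciphertext) == message
```

The slow asymmetric operation touches only the short session key, while the long message goes through the fast symmetric cipher - which is the whole point of the hybrid approach.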
So MACs and signatures and certificates and all the other things. And it adds in a couple more things to detect replays of messages - if an attacker simply replays a message recorded earlier, this is something TLS can detect - and TLS can also detect if a message within a connection has been suppressed by the attacker. So what TLS does is establish a secure connection between two entities - say, Alice and Bob - over an insecure network which is controlled by the attacker. And one application where TLS is commonly used is for sending e-mails. For example, when Alice wants to send an e-mail to Bob, the e-mail is usually not sent directly. Instead, Alice sends the message to her own e-mail server, and then Alice's e-mail server will forward this e-mail to Bob's e-mail server. And when Bob goes online and checks his e-mails, the e-mail will be downloaded to his device - his phone, desktop computer, or whatever device Bob is using. Now, Alice can make sure that the connection over which she uploads her e-mail to her own server is secure, essentially by using TLS and all the things that involves, like encrypting the message and authenticating the message and so on.
However, Alice cannot check whether her own e-mail server also uses a secure connection to forward this e-mail to Bob's server. OK. So let us take a more detailed look here. Each of these green locks signifies a secure connection. This means that each time a message is sent over a secure connection, there is some encryption and authentication going on on the sender side, and some decryption and verification going on on the receiving side. OK, so if Alice wants to send an e-mail to Bob, Alice will set up a secure connection and send the e-mail over it. This involves encrypting the e-mail and authenticating the e-mail, and Alice's server will decrypt the e-mail and verify that it has not been changed. Then Alice's server will forward the e-mail to Bob's server, which again involves encrypting it and authenticating it, and Bob's server will decrypt and verify it. And then the same process repeats when the e-mail is downloaded by Bob from his server. However, in this case, even though the message is encrypted every time it is sent over a network, it is known in plaintext by Alice's server and Bob's server. Right? Because Alice is sending the message, so she is encrypting it, and Alice's server is decrypting it - so Alice's server can read the message in plaintext. And the same goes for Bob's server.
And this is what we call transport 301 00:35:34,320 --> 00:35:43,510 encryption, because the e-mail is encrypted every time it is being sent over the network. 302 00:35:43,510 --> 00:35:52,460 And a concept opposed to this is what we call end-to-end encryption, where, 303 00:35:52,460 --> 00:35:58,619 before sending the message, Alice encrypts it, but not with a key that is known to 304 00:35:58,619 --> 00:36:04,839 her server, but directly with Bob's public key. And she might even sign it with her 305 00:36:04,839 --> 00:36:12,400 own secret authentication key. And then Alice sends this already encrypted 306 00:36:12,400 --> 00:36:17,950 message over a secure channel to her own server, which involves encrypting the 307 00:36:17,950 --> 00:36:24,849 message again and authenticating it again, and then Alice's server will decrypt the 308 00:36:24,849 --> 00:36:33,739 message and verify that it has not been changed. However, the server cannot remove 309 00:36:33,739 --> 00:36:38,359 the inner layer of encryption, right? So the e-mail is encrypted two times. One 310 00:36:38,359 --> 00:36:47,434 time with Bob's key and a second time so that the server can decrypt it. And now 311 00:36:47,434 --> 00:36:51,470 the server can remove the outer encryption. But the inner one is still 312 00:36:51,470 --> 00:36:58,660 there. So, Alice's server cannot read the e-mail. And then the process repeats: 313 00:36:58,660 --> 00:37:04,650 the already encrypted message is encrypted a second time and decrypted again at Bob's 314 00:37:04,650 --> 00:37:09,930 server, and then, when it is downloaded by Bob, it is encrypted again 315 00:37:09,930 --> 00:37:18,400 and decrypted again. And then finally, Bob, who has the secret 316 00:37:18,400 --> 00:37:24,390 decryption key, can remove the inner layer of encryption. And so Bob can read the 317 00:37:24,390 --> 00:37:31,839 message.
However, the servers in between cannot read the message, because it is 318 00:37:31,839 --> 00:37:44,969 still encrypted with Bob's public key. OK, so with that, I would like to wrap up. 319 00:37:44,969 --> 00:37:51,549 Sorry. So I would like to wrap up. So here's a couple of take-home messages. So 320 00:37:51,549 --> 00:37:57,210 the first one is: encryption conceals the content of data. And that's pretty much 321 00:37:57,210 --> 00:38:02,740 all it does. It does not conceal the metadata and it does not prevent the 322 00:38:02,740 --> 00:38:08,220 message that is being sent from being changed. That is the job of 323 00:38:08,220 --> 00:38:15,599 authentication. Authentication enables the detection of changes to data. And both for 324 00:38:15,599 --> 00:38:21,950 encryption and for authentication, you need to have pre-shared keys, or maybe 325 00:38:21,950 --> 00:38:32,080 not really pre-shared keys, but key distribution has to happen beforehand. And 326 00:38:32,080 --> 00:38:38,099 one way to make this problem of key distribution simpler is with certificates. 327 00:38:38,099 --> 00:38:47,420 So certificates confirm that a specific public key is owned by a specific entity. 328 00:38:47,420 --> 00:38:50,610 And if you have all these things, encryption and authentication and 329 00:38:50,610 --> 00:38:57,660 certificates, you can build a network protocol which takes care of securely 330 00:38:57,660 --> 00:39:02,749 transmitting a message from one place to another. And you can apply that to get 331 00:39:02,749 --> 00:39:09,619 transport encryption. But transport encryption is inferior to end-to-end 332 00:39:09,619 --> 00:39:13,829 encryption in the sense that with transport encryption, all the 333 00:39:13,829 --> 00:39:21,160 intermediaries can still read the e-mail or the message being sent. However, with 334 00:39:21,160 --> 00:39:28,589 end-to-end encryption, they cannot.
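The transport-versus-end-to-end contrast can be made concrete with a toy XOR "cipher" (illustration only, NOT a secure scheme; keys and hop names are made up). Applying the same XOR layer twice removes it, which lets each relay strip its own transport layer while the inner end-to-end layer, under Bob's key, survives the whole trip:

```python
import hashlib

def xor_layer(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher": applying it twice with the same key undoes it.
    # For illustration only -- NOT a secure encryption scheme.
    ks, ctr = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return bytes(d ^ k for d, k in zip(data, ks))

bob_key = b"known only to Alice and Bob"
inner = xor_layer(bob_key, b"Hello Bob!")  # the end-to-end layer

blob = inner
for hop_key in [b"hop-1-key", b"hop-2-key", b"hop-3-key"]:
    blob = xor_layer(hop_key, blob)  # transport layer added for this hop...
    blob = xor_layer(hop_key, blob)  # ...and removed again by the relay,
    assert blob == inner             # but the inner layer is still there

print(xor_layer(bob_key, blob))  # b'Hello Bob!' -- only Bob can read it
```

Each relay only ever sees `inner`, which is ciphertext under Bob's key; only Bob, at the very end, can peel off the last layer.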
And with that, I'd like to close, and I 335 00:39:28,589 --> 00:39:35,150 will be happy to answer your questions. If there are any questions which you cannot 336 00:39:35,150 --> 00:39:40,940 ask today, you can send me an email at the email address on the slides. I will try to 337 00:39:40,940 --> 00:39:45,334 keep that email address open for one or two years. 338 00:39:45,334 --> 00:39:55,429 *applause* 339 00:39:55,429 --> 00:40:00,790 Herald: Thank you for your talk. And now we come to the question part. If you 340 00:40:00,790 --> 00:40:07,760 have any questions, you can come up to the microphones in the middle of the rows. Are 341 00:40:07,760 --> 00:40:21,249 there any questions from the Internet? We have plenty of time. If anyone comes up 342 00:40:21,249 --> 00:40:27,960 with a question, you're invited. We have a question on microphone two, please. 343 00:40:27,960 --> 00:40:32,779 Mic 2: Thanks for your good talk, and I would like to know: how can you change a 344 00:40:32,779 --> 00:40:39,729 message that was properly encrypted without the receiving party noticing 345 00:40:39,729 --> 00:40:45,779 that the decryption doesn't work anymore? Oots: That depends on the encryption 346 00:40:45,779 --> 00:40:52,430 scheme that you're using. But for quite a number of encryption schemes, changing the 347 00:40:52,430 --> 00:40:58,359 message is actually quite simple. So there is a really large number of encryption 348 00:40:58,359 --> 00:41:07,190 schemes which just work by changing a couple of bits. Okay. So the message is 349 00:41:07,190 --> 00:41:15,400 made up of bits and your encryption scheme gives you a way to determine which of the 350 00:41:15,400 --> 00:41:21,849 bits to change and which not to change. So when you are encrypting, you use the 351 00:41:21,849 --> 00:41:26,319 encryption scheme to figure out which bits must be flipped.
So, change from zero to 352 00:41:26,319 --> 00:41:34,740 one or from one to zero, and you just apply this bit change to the message that is 353 00:41:34,740 --> 00:41:46,329 being sent, and then the receiver can just undo this change to recover the original 354 00:41:46,329 --> 00:41:52,839 message, whereas an attacker who does not know which bits have been flipped cannot. 355 00:41:52,839 --> 00:42:01,490 However, the attacker can still just flip a couple of the bits, and in this 356 00:42:01,490 --> 00:42:10,619 case, say a bit has been flipped by Alice and it is being flipped another time 357 00:42:10,619 --> 00:42:17,009 by the attacker. So the bit is at its original value again. And then Bob, who 358 00:42:17,009 --> 00:42:21,849 knows how to decrypt, will flip it another time. So it's changed again. And thus the 359 00:42:21,849 --> 00:42:27,900 message has been changed. And there's a couple of things you can do with this kind 360 00:42:27,900 --> 00:42:35,079 of change to the messages so that the decryption simply 361 00:42:35,079 --> 00:42:40,265 does not fail. It just maybe outputs the wrong message. 362 00:42:40,265 --> 00:42:43,960 Herald: OK. Next question from microphone six, please. 363 00:42:43,960 --> 00:42:50,760 Mic 6: You stated that encryption does not cover metadata. Are there any thoughts 364 00:42:50,760 --> 00:42:53,459 about that? Oots: Any thoughts? 365 00:42:53,459 --> 00:43:00,309 Mic 6: Yeah. Any solution for maybe encrypting metadata? I don't know. 366 00:43:00,309 --> 00:43:11,749 Oots: So, much of this is pretty hard to come by. So, I mean, for emails, there is 367 00:43:11,749 --> 00:43:19,059 the idea of also encrypting the subject, which is usually not encrypted. However, 368 00:43:19,059 --> 00:43:27,049 if you want to hide the length of the message, what you can do is simply pad the 369 00:43:27,049 --> 00:43:33,110 message. So just add random garbage at the end of it to kind of hide how long it is.
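The bit-flipping attack from the first answer above can be shown concretely with a one-time-pad-style stream cipher, where encryption is just XOR with a keystream (the message and key here are made up for illustration). The attacker never learns the key, but if they can guess the message format, XOR tells them exactly which ciphertext bits to flip:

```python
import secrets

msg = b"Pay Bob 100 EUR"
key = secrets.token_bytes(len(msg))               # keystream shared by Alice and Bob
ct = bytes(m ^ k for m, k in zip(msg, key))       # encryption = flip the keyed bits

# XORing '1' with '9' gives the bit pattern that turns "100" into "900";
# the attacker applies it directly to the ciphertext, no key needed.
delta = bytearray(len(ct))
delta[8] = ord("1") ^ ord("9")                    # position of the '1' in "100"
forged = bytes(c ^ d for c, d in zip(ct, delta))

# Bob's decryption does not fail -- it just outputs the wrong message.
print(bytes(c ^ k for c, k in zip(forged, key)))  # b'Pay Bob 900 EUR'
```

This is exactly why the talk pairs encryption with authentication: a MAC over the ciphertext would let Bob detect that the bits were flipped in transit.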
370 00:43:33,110 --> 00:43:40,730 Exactly. So the attacker will still have an upper bound on your message length, but 371 00:43:40,730 --> 00:43:46,160 not the exact length. So it knows that the message you're sending is at most as long as the 372 00:43:46,160 --> 00:43:54,130 ciphertexts you're sending, but it doesn't know; maybe it's shorter. So if you 373 00:43:54,130 --> 00:43:58,729 want to hide your identity while communicating, you should be going for 374 00:43:58,729 --> 00:44:05,279 something like Tor, where you're not connecting directly to the person you want 375 00:44:05,279 --> 00:44:10,119 to communicate with, but via a couple of intermediaries, in a way that none of the 376 00:44:10,119 --> 00:44:15,521 intermediaries know both you and the final recipient. 377 00:44:15,521 --> 00:44:17,639 Mic 6: Thank you 378 00:44:17,639 --> 00:44:21,279 Herald: Okay, then I believe we had a question from the Internet. 379 00:44:21,279 --> 00:44:26,369 Internet: Yes, uh, the Internet is asking: can you say anything about the additional 380 00:44:26,369 --> 00:44:31,690 power consumption of the encryption layers on a global scale? 381 00:44:31,690 --> 00:44:41,670 Oots: Sadly, I think I cannot. So I do not exactly know how much power 382 00:44:41,670 --> 00:44:51,310 consumption is caused by the encryption. However, in terms of computation, at 383 00:44:51,310 --> 00:44:58,630 least, symmetric encryption is quite cheap, in the sense that it takes a couple of 384 00:44:58,630 --> 00:45:05,589 processor cycles to decrypt, I don't know, sixteen blocks or something. So you 385 00:45:05,589 --> 00:45:12,094 can usually decrypt hundreds of megabytes per second with a processor. At least with 386 00:45:12,094 --> 00:45:26,380 a modern one.
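The padding idea from the answer above can be sketched in a few lines. The talk suggests appending random garbage; the sketch below instead uses a deterministic end-of-message marker (ISO/IEC 7816-4 style) so the receiver can strip the padding unambiguously, and the 256-byte block size is an arbitrary choice for illustration:

```python
def pad(msg: bytes, block: int = 256) -> bytes:
    # Append a 0x80 marker, then zeros up to the next multiple of `block`,
    # so that messages of different lengths look the same on the wire.
    zeros = (-(len(msg) + 1)) % block
    return msg + b"\x80" + b"\x00" * zeros

def unpad(padded: bytes) -> bytes:
    # Strip the trailing zeros, then the 0x80 marker.
    return padded.rstrip(b"\x00")[:-1]

short_msg = b"Yes"
long_msg = b"Meet me at the bridge at midnight" * 5
# Both pad up to 256 bytes, so an eavesdropper seeing the (encrypted)
# padded length only learns an upper bound on the real message length.
print(len(pad(short_msg)), len(pad(long_msg)))  # 256 256
assert unpad(pad(short_msg)) == short_msg and unpad(pad(long_msg)) == long_msg
```

Padding is applied before encryption, so the observable ciphertext length reveals only the padded size, exactly the upper bound described in the answer.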
So I don't know any numbers, but I mean you can guess that if everyone 387 00:45:26,380 --> 00:45:31,239 in the first world countries is using encryption, then in sum there is a 388 00:45:31,239 --> 00:45:36,069 pretty large amount of energy going into it. 389 00:45:36,069 --> 00:45:38,470 Herald: Next question. Microphone two, please. 390 00:45:38,470 --> 00:45:44,599 Mic 2: You mentioned a couple of times that an attacker might be able to replay a 391 00:45:44,599 --> 00:45:50,720 message. I haven't really understood: if I'm an attacker, how does it benefit me 392 00:45:50,720 --> 00:46:01,950 that I am able to do that? Oots: So imagine that Alice is sending an 393 00:46:01,950 --> 00:46:08,859 order to her bank to transfer some money. And every time the bank is receiving such 394 00:46:08,859 --> 00:46:14,510 an order, it will initiate the bank transfer. Then as an attacker, that would 395 00:46:14,510 --> 00:46:22,170 be pretty cool to exploit, because you once eavesdrop on Alice sending such an order 396 00:46:22,170 --> 00:46:29,130 to the bank and then later on you can just repeat the same order to the bank and more 397 00:46:29,130 --> 00:46:33,690 money will be sent. So if you were the recipient, that would be pretty cool and 398 00:46:33,690 --> 00:46:38,350 you could just deplete Alice's bank account. 399 00:46:38,350 --> 00:46:42,000 Herald: Then a question from microphone three. 400 00:46:42,000 --> 00:46:48,309 Mic 3: I was in a talk about elliptic curve cryptography, and I'm wondering 401 00:46:48,309 --> 00:46:54,229 where we would apply it now in your example or in the process you showed us. 402 00:46:54,229 --> 00:47:09,769 Oots: Um, let me maybe just go to another slide. Um. So typically elliptic curves 403 00:47:09,769 --> 00:47:16,380 are applied within these encryption and decryption 404 00:47:16,380 --> 00:47:20,079 boxes. Okay.
So there is a lot of mathematics going on in these 405 00:47:20,079 --> 00:47:25,989 computations, which I did not explain because I wanted to keep this talk 406 00:47:25,989 --> 00:47:31,190 accessible to a broad audience. But one way to realize such an encryption 407 00:47:31,190 --> 00:47:38,700 procedure is by using elliptic curves within these boxes. 408 00:47:38,700 --> 00:47:44,949 Herald: OK. Microphone one, please. Mic 1: Another limitation I could think of, or how to overcome this, is devices like 409 00:47:44,949 --> 00:47:51,359 IoT devices that have low power and 410 00:47:51,359 --> 00:47:59,480 limited processing capability. So how do you adapt this complex encryption, the 411 00:47:59,480 --> 00:48:06,279 encryption computation, for those devices? Oots: There is some research going on on 412 00:48:06,279 --> 00:48:14,369 encryption schemes that are particularly lightweight, so particularly suited for 413 00:48:14,369 --> 00:48:21,709 resource-constrained devices. But as far as I know, pretty much all of them have 414 00:48:21,709 --> 00:48:28,380 some weaknesses that have come out. So security-wise, they do not offer the 415 00:48:28,380 --> 00:48:34,839 same guarantees as the ones that you use if you have the resources for it. 416 00:48:34,839 --> 00:48:40,319 Herald: On microphone two, please. Mic 2: Yeah. Hi. You mentioned the 417 00:48:40,319 --> 00:48:43,559 enormous power that certificate authorities have in the picture of 418 00:48:43,559 --> 00:48:49,299 certification and authentication. I was wondering, I mean, what are the 419 00:48:49,299 --> 00:48:53,749 possible solutions or the proposed solutions at the moment? What is the state of the 420 00:48:53,749 --> 00:48:59,519 art on solving that problem? Oots: So one solution that is currently 421 00:48:59,519 --> 00:49:05,989 being pushed is called certificate transparency.
And that works by basically 422 00:49:05,989 --> 00:49:11,970 creating a public log file, or lots of public log files, where each certificate that is ever 423 00:49:11,970 --> 00:49:19,089 created must be added to the log file. And so if you are 424 00:49:19,089 --> 00:49:25,819 Google and you see someone entering a certificate for Google into the log file and 425 00:49:25,819 --> 00:49:30,829 you know that you didn't ask for the certificate, then you know that there is a 426 00:49:30,829 --> 00:49:35,959 fake certificate out there. And so whenever you get a certificate, you are 427 00:49:35,959 --> 00:49:40,950 expected to check if the certificate is actually contained in one of the public 428 00:49:40,950 --> 00:49:49,859 log files. Does that answer the question? Mic 2: Yes. But how would appending a 429 00:49:49,859 --> 00:49:56,150 certificate work? So, for example, making sure the certificate is recognized 430 00:49:56,150 --> 00:50:01,459 by Alice as legitimate. Oots: OK. So the idea is, whenever you get 431 00:50:01,459 --> 00:50:06,930 a certificate, it will be put in a log file, and everyone who gets the certificate 432 00:50:06,930 --> 00:50:10,329 is expected to check that it is actually in the log file. 433 00:50:10,329 --> 00:50:15,180 Mic 2: So it is the certificate authority that also pushes the certificate to the 434 00:50:15,180 --> 00:50:17,950 log? Oots: That's how it's expected to work. 435 00:50:17,950 --> 00:50:20,330 Yes. Mic 2: OK. Thank you. 436 00:50:20,330 --> 00:50:23,170 Oots: You're welcome. Herald: Then we have one more question 437 00:50:23,170 --> 00:50:27,559 from the Internet. Internet: The Internet wants to know: can we, 438 00:50:27,559 --> 00:50:34,619 or where can we, get an authentication for a PGP key, and how to apply it to a key 439 00:50:34,619 --> 00:50:39,390 afterwards. Is there a possibility somehow? 440 00:50:39,390 --> 00:50:47,599 Oots: I guess that depends.
So with PGP, the common model is that there is not 441 00:50:47,599 --> 00:50:56,519 a central certification authority, or a bunch of them, but you 442 00:50:56,519 --> 00:51:01,689 have kind of a social graph of people who know each other and exchange e-mails. And 443 00:51:01,689 --> 00:51:10,109 each of these users should authenticate the public keys of 444 00:51:10,109 --> 00:51:16,369 their peers. OK? So when you want to communicate with someone who you do not 445 00:51:16,369 --> 00:51:22,359 already know, but who's maybe a friend of a friend of yours, then hopefully your 446 00:51:22,359 --> 00:51:31,519 friend will have authenticated the public key of his friend. And if you trust your 447 00:51:31,519 --> 00:51:40,960 friend, you can then check that your friend has, in fact, created a kind of 448 00:51:40,960 --> 00:51:49,829 certificate for his friend. Herald: Are there more questions from the 449 00:51:49,829 --> 00:51:56,709 Internet? One more? Yes, please. Internet: I don't know if it's a question 450 00:51:56,709 --> 00:52:03,440 regarding your talk, really, but someone wants to know: would you recommend 451 00:52:03,440 --> 00:52:14,839 STARTTLS or SSL/TLS for e-mail? Oots: So as far as I'm concerned, I would 452 00:52:14,839 --> 00:52:22,890 always opt for using encryption on the outermost layer. So first building a 453 00:52:22,890 --> 00:52:31,499 secure connection to your email server and then doing SMTP or whatever over that 454 00:52:31,499 --> 00:52:38,209 connection. So I think directly establishing the connection in a secure 455 00:52:38,209 --> 00:52:46,846 manner is a better way to do it than with STARTTLS. 456 00:52:46,846 --> 00:52:51,359 Herald: I believe that was it for questions. Please have a last round of 457 00:52:51,359 --> 00:52:52,939 applause for Oots!
458 00:52:52,939 --> 00:52:58,387 *applause* 459 00:52:58,387 --> 00:53:12,751 *36c3 postroll music* 460 00:53:12,751 --> 00:53:24,689 Subtitles created by c3subtitles.de in the year 2020. Join, and help us!