1
00:00:00,780 --> 00:00:03,500
At a time when injustice is growing,

2
00:00:03,500 --> 00:00:06,060
populists and racists are on the rise -

3
00:00:06,060 --> 00:00:08,160
and while we are trying to

4
00:00:08,160 --> 00:00:11,800
combat discrimination based on appearance and origin,

5
00:00:11,800 --> 00:00:14,720
the EU supports the research and use

6
00:00:14,720 --> 00:00:18,100
of technology that further intensifies these problems:

7
00:00:18,100 --> 00:00:21,540
Biometric mass surveillance systems.

8
00:00:21,540 --> 00:00:24,780
The hope associated with this is understandable:

9
00:00:24,780 --> 00:00:27,020
More security, more efficiency -

10
00:00:27,020 --> 00:00:29,020
less crime!

11
00:00:29,020 --> 00:00:32,300
But from a scientific point of view, this hope is not justified:

12
00:00:32,300 --> 00:00:37,080
Every "artificial intelligence" has to be trained with selected data,

13
00:00:37,080 --> 00:00:41,000
so it always learns the unconscious biases of its programmers.

14
00:00:41,000 --> 00:00:43,540
In practice, these biases are self-reinforcing -

15
00:00:43,540 --> 00:00:46,600
fully automated and incredibly fast:

16
00:00:46,600 --> 00:00:49,460
"These people cause problems quite often.

17
00:00:49,460 --> 00:00:51,800
They need to be watched more closely.

18
00:00:51,800 --> 00:00:53,800
These people cause problems quite often.

19
00:00:53,800 --> 00:00:55,160
They need to be watched more closely..."

20
00:01:00,780 --> 00:01:02,100
And if you are thinking:

21
00:01:02,100 --> 00:01:05,280
"Oh, that doesn't concern me - I have nothing to hide..."

22
00:01:05,280 --> 00:01:08,500
it is worth looking at how this typically progresses:

23
00:01:08,500 --> 00:01:10,500
Even harmless-looking systems,

24
00:01:10,500 --> 00:01:14,680
which were initially intended only for convenient, targeted identification,

25
00:01:14,680 --> 00:01:18,360
are subsequently also used for large-scale monitoring.

26
00:01:18,360 --> 00:01:20,720
It makes sense: "The technology already exists,

27
00:01:20,720 --> 00:01:22,220
it was expensive to purchase,

28
00:01:22,220 --> 00:01:24,440
and it can recognize suspects..."

29
00:01:24,440 --> 00:01:29,000
Up until now, virtually every system that was potentially capable of surveillance

30
00:01:29,000 --> 00:01:31,760
has sooner or later been used for it.

31
00:01:31,760 --> 00:01:35,140
And with biometric data, it becomes especially dangerous:

32
00:01:35,140 --> 00:01:38,220
For example, even without your knowledge,

33
00:01:38,220 --> 00:01:39,780
your face can end up in databases

34
00:01:39,780 --> 00:01:44,700
which can identify you anywhere, at any time, completely automatically.

35
00:01:44,700 --> 00:01:48,780
If your data is then shared or sold on and linked with other data,

36
00:01:48,780 --> 00:01:52,320
it could have a devastating impact on your life.

37
00:01:52,320 --> 00:01:55,920
Your creditworthiness, for example, could be downgraded.

38
00:01:55,920 --> 00:01:58,160
You could lose just your freedom to travel...

39
00:01:58,160 --> 00:02:00,820
or you could lose all your freedom.

40
00:02:00,820 --> 00:02:03,060
But even if a perfect,

41
00:02:03,060 --> 00:02:07,920
safe, fair and completely neutral monitoring system existed,

42
00:02:07,920 --> 00:02:10,140
there would always be a side effect -

43
00:02:10,140 --> 00:02:12,860
the so-called "chilling effect":

44
00:02:12,860 --> 00:02:15,840
People who feel that they are being watched

45
00:02:15,840 --> 00:02:17,840
automatically behave in a more conformist way,

46
00:02:17,840 --> 00:02:19,840
because they want to avoid attracting attention.

47
00:02:19,840 --> 00:02:22,300
As a result, resistance to injustice

48
00:02:22,300 --> 00:02:25,880
and solidarity within society decrease.

49
00:02:25,880 --> 00:02:27,880
With far-reaching consequences.

50
00:02:28,580 --> 00:02:33,200
But the EU has the opportunity to stop this trend now.

51
00:02:33,200 --> 00:02:37,300
Help us reclaim our public space.

52
00:02:37,300 --> 00:02:38,940
Claim your rights:

53
00:02:38,940 --> 00:02:42,080
to privacy, to freedom of speech,

54
00:02:42,080 --> 00:02:45,800
to protest, and to freedom from discrimination.

55
00:02:45,800 --> 00:02:49,480
What we need now are fair, non-discriminatory systems

56
00:02:49,480 --> 00:02:51,940
with a human sense of proportion and reason.

57
00:02:51,940 --> 00:02:53,620
This creates jobs

58
00:02:53,620 --> 00:02:56,160
and frees up urgently needed resources:

59
00:02:56,160 --> 00:02:59,080
Because the biggest problems that threaten our security

60
00:02:59,080 --> 00:03:01,400
cannot be solved with biometric surveillance,

61
00:03:01,400 --> 00:03:06,140
profiling, and the questionable fortune-telling that goes with them.

62
00:03:06,140 --> 00:03:07,200
On the contrary:

63
00:03:07,200 --> 00:03:10,520
they are an additional danger to the rule of law

64
00:03:10,520 --> 00:03:12,980
and our most important fundamental rights.

65
00:03:12,980 --> 00:03:18,820
So sign the European Citizens' Initiative for a new law:

66
00:03:20,740 --> 00:03:23,000
For a ban on biometric mass surveillance!

67
00:03:23,000 --> 00:03:26,840
reclaimyourface.eu