*rC3 preroll music*

Herald: Please welcome with me, with a big round of applause in your living room or wherever you are, derJoram. derJoram is a science communicator. He got his university education and his first scientific experience at the Max Planck Institute. And he will now give you a crash course for beginners, so you get the best insight into the scientific method and can distinguish science from rubbish. derJoram, the stage is yours.

derJoram: Hi, nice to have you here. My name is Joram Schwartzmann and I'm a plant biologist. And today I want to talk about science. I have worked in research for many years, first during my diploma thesis and then during my doctoral research. I've worked both at universities and at the Max Planck Institute, so I got pretty good insights into the way these structures work. After my PhD, I left the research career to instead talk about science, which is also what I'm about to do today. I now work in science communication, both as a job and in my spare time, when I write about molecular plant research online. Today, I will only mention plants a tiny bit, because the topic is a different one: today we are talking about science literacy. So basically, how does the scientific system work? How do you read scientific information, and which information can you trust? Science. It's kind of a big topic. Before we start, it's time for some disclaimers: I am a plant biologist. I know stuff about STEM research, that is, science, technology, engineering and mathematics. But there is so much more other science out there. The social sciences and humanities share many core concepts with the natural sciences, but also have many approaches that are unique to them. I don't know a lot about the way these work, so please forgive me if I stick close to what I know, which is STEM research. Talking about science is also much less precise than doing the science. For pretty much everything that I'll bring up today, there is an example where it is completely different. So if in your country, field of research or experience something is different, we're probably both right about whatever we're talking about. With that out of the way, let's look at the things that make science *science*. There are three parts of science that are connected. The first one is the scientific system. This is the way science is done. Next up, we have people who do the science. The scientific term for them is researchers.
We want to look at how you become a researcher, how researchers introduce biases, and how they pick their volcanic lair to do evil science. Finally, there are publications, and this is the front end of science, the stuff we look at most of the time when we look at science. There are several different kinds, and not all of them are equally trustworthy. Let's begin with the scientific system. We don't just do science, we do science systematically. Since the first people tried to understand the world around them, we have developed a complex system for science. At the core of that is the scientific method. The scientific method gives us structure and tools to do science. Without it, we end up in the realm of guesswork, anecdotes and false conclusions. Here are some of my favorite things that were believed before the scientific method became standard: Gentlemen could not transmit disease. Mice are created from grain and cloth. Blood is exclusively produced by the liver. Heart-shaped plants are good for the heart. But thanks to the scientific method, we have a system that allows us to make confident judgments about our observations. Let's use an example. This year has aged me significantly, and so, as a newly formed old person, I have pansies on my balcony. I have blue ones and yellow ones, and in summer I can see bees buzz around the flowers. I have a feeling, though, that they like the yellow ones better. That right there is an observation. I now think to myself, *I wonder if they prefer the yellow flowers over the blue ones based on the color*, and this is my hypothesis. The point of a hypothesis is to test it so I can accept or reject it later. So I come up with a test: I count all bees that land on yellow flowers and on blue flowers within a weekend. That is my experiment. So I sit there all weekend with one of these clicky things in each hand and count the bees on the flowers. Every time a bee lands on a flower, I click. *click, click, click, click, click*. It's the most fun I had all summer. In the end, I look at my numbers. These are my results. I saw sixty-four bees on the yellow flowers and twenty-seven on the blue flowers. Based on my experiment I conclude that bees prefer yellow pansies over blue ones. I can now return and accept my hypothesis: bees do prefer yellow flowers over blue ones. Based on that experiment I made a new observation and can now form a new hypothesis: do other insects follow the same behavior?
And so I sat there again the next weekend, counting all the hoverflies on my pansies. Happy days. The scientists in the audience are probably screaming by now. I am, too, but on the inside. My little experiment and the conclusions I drew were flawed. First up, I didn't do any controls apart from yellow versus blue. What about time? Do the days or seasons matter? Maybe I picked the one time period when bees actually do prefer yellow, but on most other days they like blue better? And then I didn't control for position. Maybe the blue ones get less sunlight and are less warm, so a good control would have been to swap the pots around. I also said I wanted to test color. Another good control would have been to put up a cardboard cutout of a flower in blue and yellow and see whether it is the color or maybe another factor that attracts the bees. And then I only counted once. I put the two data points into an online statistical calculator, and when I had calculated it, it told me I had internet connectivity problems. So I busted out my old statistics textbook. As it turns out, you need repetitions of your experiment to do statistics, and without statistics, you can't be sure of anything. If you want to know whether what you measure is random or truly different between your two conditions, you do a statistical test that tells you with what probability your result could be random. That is called a p-value. You want that number to be low. In biology, we're happy with a chance of one in twenty, so five percent, that the difference we observe between two measurements happened by chance. In high energy particle physics, that accepted chance of seeing a random effect is 1 in 3,500,000, or 0.00003%. So without statistics, you can never be sure whether you observe something important or just two numbers that look different. A good way to do science is to do an experiment a couple of times, three at least, and then repeat it with controls, again at least three times. With a bigger data set, I could actually make an observation that holds statistical significance. So why do I tell you all of this? You want to know how to understand science, not how to do it yourself. Well, as it turns out, controls and repetitions are also a critical point to check when you read about scientific results. Often enough, cool findings are based on experiments that didn't control for certain things or that are based on very low numbers of repetitions.
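As a minimal sketch of what such a statistical test can look like, assuming the null hypothesis that bees land on yellow and blue flowers with equal probability, an exact two-sided binomial test on the counts from the example above could be run like this (the helper function and output comments are purely illustrative, not something from the talk):

```python
# Minimal sketch: exact two-sided binomial test on the illustrative bee counts.
# Null hypothesis: bees land on yellow and blue flowers with equal probability.
from math import comb

def exact_binomial_p_value(k: int, n: int, p: float = 0.5) -> float:
    """Probability of an outcome at least as unlikely as k successes out of n."""
    probs = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    observed = probs[k]
    # Sum the probabilities of all outcomes no more likely than the observed one.
    return sum(q for q in probs if q <= observed)

yellow, blue = 64, 27                      # counts from the example above
p_value = exact_binomial_p_value(yellow, yellow + blue)
print(f"p-value: {p_value:.2g}")           # roughly 1e-4, below the 0.05 used in biology

# Particle physics demands a far smaller chance of a random effect ("five sigma"):
print(f"1 in 3,500,000 as a percentage: {1 / 3_500_000:.5%}")
```

Even a small p-value like this would not rescue a single weekend of counting without controls, which is exactly the point about repetitions and controls made above.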
You have to be careful with conclusions from such experiments, as they might be wrong. So when you read about science, look for signs that the researchers followed the scientific method: a clearly stated hypothesis, experiments with proper controls, and enough repetitions to do solid statistics. It seems like an obvious improvement for the scientific system to just do more repetitions. Well, there is a problem with that. Often, experiments require the researchers to break things. Maybe just because you take the things out of their environment and into your lab, maybe because you can only study something when it's broken. And as it turns out, not all things can be broken easily. Let me introduce you to my scale of how easy it is to break the thing you study. All the way to the left, you have things like particle physics. It's easy to break particles. All you need is a big ring and some spare electrons you put in there really, really fast. Once you have these two basic things, you can break millions of particles and measure what happens, so you can calculate really good statistics on them. Then you have other areas of physics. In material science, the only thing that stops you from testing how hard a rock is, is the price of your rock. Again, that makes us quite confident about the material properties of things. Now we enter the realm of biology. Biology is less precise because living things are not all the same. If you take two bacterial cells of the same species, they might still be slightly different in their genome. But luckily, we can break millions of bacteria and other microbes without running into ethical dilemmas. We even ask researchers to become better at killing microbes. So doing more of the experiment is easier when working with microbes. It gets harder, though, with bigger and more complex organisms. Want to break plants in a greenhouse or in a field? As long as you have the space, you can break thousands of them for science and no one minds. How about animals like fish and mice and monkeys? There it gets much more complicated very quickly. While we are happy to kill thousands of pigs every day for sausages, we feel much less comfortable doing the same for science. And it's not a bad thing that we try to reduce harm to animals. So while you absolutely can do repetitions and controls in animal testing, you are usually limited by the number of animals you can break for science. And then we come to human biology.
If you thought it was hard doing lots of repetitions and controls in animals, try doing that in humans. You can't grow a human on a corn-sugar-based diet just to see what would happen. You can't grow humans in isolation, and you can't breed humans to develop more cancer as a control in your cancer experiment. So with anything that involves science on humans, we have to have very clever experiment design to control for all the things that we can't control. The other way to do science on humans, of course, is to be a genetic lifeform and disk operating system. What this scale tells us is how careful we have to be with conclusions from any of these research areas. We have to apply much more skepticism when looking at single studies on human food than when we study how hard a rock is. If I'm interested in stuff on the right end of the spectrum, I'd rather see a couple of studies pointing at a conclusion, whereas the further I get to the left-hand side, the more I trust single studies. That still doesn't mean that there can't be mistakes in particle physics, but I hope you get the idea. Back to the scientific method. Because it is circular, it is never done, and neither is science. We can always uncover more details, look at related things and refine our understanding. There's no field where we could ever say: OK, let's pack up, we now know everything. Good job, everyone, the science has been completely done. Everything in science can potentially be overturned. Nothing is set in stone. However, and it's a big however, it's not likely that this happens for most things. Most things have been shown so often that the chance that we will find out that water actually boils at 250 degrees centigrade at sea level and normal pressure is close to zero. But if researchers were able to show such strange behavior of water, it is in the nature of science to include that result in our understanding, even if that breaks some other ideas that we have about the world. That is what sets science apart from dogma. New evidence is not frowned upon and rejected, but welcomed and integrated into our current understanding of the world. Enough about the scientific system. Let's talk about scientists. You might be surprised to hear it, but most researchers are actually people. Other people, who are not researchers, tend to forget that, especially when they talk about the science that the researchers do. That goes both ways.
There are some who believe in the absolute objective truth of science, ignoring all influence researchers have on the data. And there are others who say that science is lying about things like vaccinations, climate change or infectious diseases. Both groups are wrong. Researchers are not infallible demigods that eat nature and poop wisdom. They're also not conspiring to bring harm to society in search of personal gain. Trust me, I know people who work in pesticide research; they're as miserable as any other researcher. Researchers are people, and so they have thoughts and ideas and wishes and biases and faults and good intentions. Most people don't want to do bad things and inflict harm on others, and neither do researchers. They aim to do good things and make people's lives better. The problem with researchers being people is that they are also flawed. We all have cognitive biases that shape the way we perceive and think about the world. And in science, there's a whole list of biases that affect the way we gather data and draw conclusions from it. Luckily, there are ways to deal with most biases. We have to be aware of them, address them and change our behavior to avoid them. What we can't do is deny their impact on research. Another issue is diversity. Whenever you put a group of similar people together, they will only come up with ideas that fit within their group. That's why it is a problem when only white men dominate research leadership positions. *Hold on*, some of you might shout. *These men are men of science. They are objective. They use the scientific method. We don't need diversity. We need smart people.* To which I answer: *ugghhh*. Here is a story for you. For more than 150 years, researchers believed that only male birds sing. It fits the simple idea that male birds do all the mating rituals and stuff, so they must be the singers. Just like in humans, female birds were believed to just sit and listen while the men shout at each other. In the last 20 years, this idea was debunked. New research found that female birds sing, too. So how did we miss that for so long? Another study, on those studies, found that during the 20 years in which the dogma of exclusively male songbirds was overturned, the researchers changed. Suddenly, more women took part in the research, and research happened in more parts of the world. Previously, mostly men in the U.S., Canada, England and Germany had been studying singing birds in their own countries.
As a result, they subconsciously introduced their own biases and ideas into the work. And so we believed for a long time that female birds keep their beaks shut. Only when the group of researchers diversified did we get new and better results. The male researchers didn't ignore the female songbirds out of bad faith. The men were shaped by their environment, but they didn't want to do bad things. They just happened to overlook something that someone with a different background would pick up on. What does this tell us about science? It tells us that science is influenced, consciously or subconsciously, by internal biases. When we talk about scientific results, we need to take that into account. Especially in studies regarding human behavior, we have to be very careful about experiment design, framing and interpretation of results. If you read about science that makes bold claims about the way we should work, interact or communicate in society, that science is prone to being shaped by bias, and you should be very careful when drawing conclusions from it. I personally would rather wait for several studies pointing in a similar direction before I draw major conclusions. I have linked to a story about a publication on the influence of female mentors on career success, and it was criticized for a couple of these biases. If we want to understand science better, we also have to look at how someone becomes a scientist, and I mean that in the sense of a professional career. Technically, everybody is a scientist as soon as they test a hypothesis, observe the outcome and repeat. But unfortunately, most of us are not paid for the tiny experiments of our day-to-day life. If you want to become a scientist, you usually start by entering academia. Academia is the world of universities, colleges and research institutes. There is a lot of science done outside of academia, like research and development in industry or individuals taking part in DIY science. As these groups rarely enter the spotlight of public attention, I will ignore them today. Sorry. So this is a typical STEM career path. You begin as a Bachelor's or Master's student. You work for something between three months and a year, and then, *wohoo*, you get a degree. From here you can leave and go into industry, become a scientific researcher at a university, or continue your education. If you continue, you're most likely to do a PhD.
But before you can select one of the more exciting title options on a form when you order your food, you have to do research. For three to six years, depending on where you do your PhD, you work on a project and most likely will not have a great time. You finish with your degree and some publications. A lot of people leave now, but if you stay in research, you'll become a postdoc. The word postdoc comes from the word "doc" as in doctorate, and "post" as in you have to post a lot of application letters to get a job. Postdocs do more research, often on broader topics. They supervise PhD students and are usually pretty knowledgeable about their research field. They work and write papers until one of two things happens: the German Wissenschaftszeitvertragsgesetz bites them in the butt and they get no further contract, or they move on to become a group leader or professor. Being a professor is great. You have a permanent research position, you get to supervise, and you get to talk to many other cool researchers. You probably know a lot by now, not only about your field but also about many other fields in your part of science, as you constantly go to conferences, because they have good food and also people talking about science. The downside is, you're probably not doing any experiments yourself anymore. You have postdocs and PhD students who do that for you. If you want to go into science, please have a look at this. What looks like terrible city planning is actually terrible career planning, as less than one percent of PhDs will ever reach the level of professor, also known as the only stable job in science. That's also what happened to me: I left academia after my PhD. So what do we learn from all of this? Different stages of a research career correlate with different levels of expertise. If you read statements from a Master's student or a professor, you can get an estimate of how much they know about their field and, in turn, of how solid their science is. Of course, this is just a rule of thumb: I have met both very knowledgeable Master's students and professors who knew nothing apart from their own small piece of work. So whenever you read statements from researchers, independent of their career stage, you should also wonder whether they represent the scientific consensus. Any individual scientist might have a particular hot take about something they care about, but in general, they agree with their colleagues.
When reading about science that relates to policies or public debates, it is a good idea to explore whether a particular researcher is representing their own opinion or that of their peers. Don't ask the researcher directly, though; every single one of them will say that, of course, they represent the majority opinion. The difference between science and screwing around is writing it down, as Adam Savage once said. Science without publications is pretty useless, because if you keep all that knowledge to yourself, well, congrats, you are very smart now, but that doesn't really help anyone but you. Any researcher's goal, therefore, is to get their findings publicly known so that others can extend the work and create scientific progress. So let's go back to my amazing bee research. I did the whole experiment again, with proper controls this time, and now I want to tell people about it. The simplest way to publish my findings would be to tweet about it. But then a random guy would probably tell me that I'm wrong and stupid and should go f*** myself. So instead I do what most researchers would do and go to a scientific conference. That's where researchers hang out, have a lot of coffee and sit and listen to talks from other researchers. Conferences are usually the first place where new information becomes public. Well, public is a bit of a stretch; usually the talks are not recorded or made accessible to anyone who wasn't there at the time. So while the information is pretty trustworthy, it remains fairly inaccessible to others. After my conference talk, the next step is to write up all the details of my experiment and the results in a scientific paper. Before I send this to an editor at a scientific journal, I could publish it myself as a pre-print. These pre-prints are drafts of finished papers that are available for anyone to read. They are great because they provide easy access to information that is otherwise often behind paywalls. They are not so great because they have not yet been peer reviewed. If a pre-print hasn't also been published with peer review, you have to be careful with what you read, as it is essentially only the point of view of the authors. Peer review only happens when you submit your paper to a journal. Journals are a whole thing, and there have been some great talks in the past about why many of them are problematic.
Let's ignore for a second how these massive enterprises collect money from everyone they get in contact with, and let's focus instead on what they're doing for the academic system. I send them my paper, an editor sees if it's any good and then sends my paper to two to three reviewers. These are other researchers who critically check everything I did and eventually recommend accepting or rejecting my paper. If it is accepted, the paper will be published. I pay a fee and the paper will be available online, often behind a paywall, unless I pay some more cash. At this point, I'd like to have a look at how a scientific paper works. There are five important parts to any paper: the title, the author list, the abstract, the figures and the text. The title is a summary of the main findings, and unlike in popular media, it is much more descriptive. Where a newspaper leaves out the most important information to get people to read the article, in a study the information is right there in the title. In my case that could be "Honeybees (Apis mellifera) show selective preference for flower color in Viola tricolor". You see, everything is right there: the organisms I worked with and the main result I found. Below the title stands the author list. As you might have guessed, the author list is a list of authors. Depending on the field the paper is from, the list can be ordered alphabetically or according to relative contribution. If it is by contribution, then you usually find that the first author did most of the work, the middle authors contributed some smaller parts, and the last author paid for the whole thing. The last author is usually a group leader or professor. A good way to learn more about a research group and their work is to search for the last author's name. The abstract is a summary of the findings. Read this to get a general idea of what the researchers did and what they found. It is very dense in information, but it is usually written in a way that researchers from other fields can also understand at least some of it. The figures are pretty to look at and hold the key findings in most papers, and the text has the full story with all the details, the jargon and all the references that the research is built on. You probably won't read the text unless you care a lot, so stick to title, abstract and authors to get a quick understanding of what's going on. Scientific papers reflect the peer-reviewed opinion of one or a few research groups.
If you are interested in a broader topic, like which insects like to pollinate which flowers, you should read review papers. These are peer-reviewed summaries with a much broader scope, often weighing multiple points of view against each other. Review papers are a great resource that avoids some of the biases individual research groups might have about their topic. So, my research is reviewed and published. I could go back now and start counting butterflies, but this is not where the publishing of scientific results ends. My institute might think that my bee counting is not just not bad, it is actually amazing, and so they will issue a press release. Press releases often emphasize the positive parts of a study while putting them into the context of something that's relevant to most people. Something like "bees remain attracted to yellow flowers despite the climate crisis". The facts in a press release are usually correct, but shortcomings of the study that I mentioned in the paper are often missing from the press release. Because my bee study is really cool and because the PR department of my institute did a great job, journalists pick up on the story. The first ones are often outlets with a focus on science, like *Scientific American* or *Spektrum der Wissenschaft*. Most of the time, science journalists do a great job of finding more sources and putting the results into context. They often ask other experts for their opinion, and they break down the scientific language into simpler words. Science journalism is the source I recommend to most people when they want to learn about a field that they are not experts in. Because my bee story is freaking good, mainstream journalists also report on it. They are often pressed for time and write for a much broader audience, so they just report the basic findings, often putting even more emphasis on why people should care: usually climate change, personal health or, now, Covid. Mainstream press coverage is rarely as detailed as the previous reporting and has the strongest tendency to accidentally misrepresent facts or add framing that researchers wouldn't use. Oh, and then there is the weird uncle who posts a link to the article on Facebook with a blurb of text that says the opposite of what the study actually found. As you might imagine, the process of getting scientific information out to the public quickly becomes a game of telephone.
What is clearly written in the paper is framed positively in a press release and gets watered down even more once it reaches the mainstream press. So for you, as someone who wants to understand the science, it is a good idea to be more careful the further you get away from the original source material. While dedicated science journalism usually does a good job of breaking down the facts without distortion, the same can't be said for popular media. If you come across an interesting story, try to find another version of it in a different outlet, preferably one catering more to an audience with a scientific interest. Of course, you can jump straight to the original paper, but understanding the scientific jargon can be hard and misunderstanding the message is easy, so it can do more harm than good. We see that harm now with hobby epidemi..., epidemio..., epidemiologists, people who do not actually study epidemics, who are making up their own pandemic models. They cherry-pick bits of information from scientific papers without understanding the bigger picture and context, and then post their own charts on Twitter. It's cool if you want to play with data in your free time, and it's a fun way to learn more about a topic, but it can also be very misleading and harmful in the middle of a pandemic, when expert studies have to fight for attention with non-experts' Excel graphs. It pays off to think twice about whether you're actually helping by publishing your own take on a scientific question. Before we end, I want to give you some practical advice on how to assess the credibility of a story and how to understand the science better. This is not an in-depth guide to fact-checking; I want you to get a sort of gut feeling about science. When I read scientific information, these are the questions that come to my mind. First up, I want you to ask yourself: is this plausible, and does this follow the scientific consensus? If both answers are "no", then you should carefully check the sources. More often than not, such results are outliers that somebody exaggerated to get news coverage, or someone is actively reframing scientific information for their own goals. To get a feeling for the scientific consensus on things, it is a good idea to look for joint statements from research communities.
Whenever an issue that is linked to current research comes up for public debate, there is usually a joint statement laying down the scientific opinion, signed by dozens or even hundreds of researchers, like, for example, the ones from Scientists for Future. And then, whenever you see a big number, you should look for context. When you read statements like "We grow sugar beet on an area of over 400,000 hectares", you should immediately ask yourself: "Who is 'we'? Is it Germany, Europe, the world? What is the time frame? Is that per year? Is that a lot? How much is that compared to other crops?" Context matters a lot, and often big numbers are used to impress you. In this case, 400,000 hectares is the yearly area that Germany grows sugar beet on. Wheat, for example, is grown on over 3 million hectares per year in Germany. Context matters, and so whenever you see a number, look for a frame of reference. If the article doesn't give you one, either go and look for yourself or ignore the number when making decisions based on the article. Numbers only work with framing, so be aware of that. I want you to think briefly about how you felt when I gave you that number of 400,000 hectares. Chances are that you felt a sort of unease, because it's really hard to imagine such a large number. An interesting exercise is to create your own frame of reference. Collect a couple of numbers, like the total agricultural area of your country, the current spending budget of your municipality, the average yearly income, or the unemployment rate in relative and absolute numbers. Keep the list somewhere accessible and use it whenever you come across a big number that is hard to grasp. Is 100,000€ a lot of money in the context of public spending? How important are 5,000 jobs in the context of population and unemployment? Such a list can defuse the occasional scary big number in news articles, and it can also help you to make your own point better.
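As a minimal sketch of how such a frame-of-reference list can be used, assuming a rough anchor of about 16.6 million hectares of agricultural land in Germany (an assumed round figure for illustration, not a number from the talk), the sugar beet area could be put into context like this:

```python
# Hypothetical sketch of a personal frame-of-reference list for big numbers.
# The agricultural-area anchor is an assumed round figure for illustration;
# the wheat and sugar beet areas are the ones mentioned above.
ANCHORS_HA = {
    "total German agricultural area (assumed)": 16_600_000,
    "German wheat area per year": 3_000_000,
}

def put_in_context(label: str, value_ha: float) -> None:
    """Express a headline number as a share of each anchor number."""
    for anchor_name, anchor_ha in ANCHORS_HA.items():
        share = 100 * value_ha / anchor_ha
        print(f"{label} is {share:.0f}% of the {anchor_name}")

put_in_context("400,000 ha of sugar beet", 400_000)
# -> about 2% of the agricultural area and about 13% of the wheat area,
#    which feels far less dramatic than the bare number.
```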
Speaking of framing, always be aware of who the sender of the information is. News outlets rarely have a specific scientific agenda, but NGOs do. If Shell, the oil company, provided a leaflet in which they cite scary numbers and present research they funded that finds that oil drilling is actually good for the environment, but they won't disclose who they worked with for the study, we would all laugh at that information. But if we read a leaflet from an environmental NGO in Munich that is structurally identical, just with a narrative about glyphosate in beer that fits our own perception of the world, we are more likely to accept the information in the leaflet. In my opinion, both sources are problematic, and I would not use either of them to build my own opinion. Good journalists put links to their sources in or under the article, and it is a good idea to check them. Often, however, you have to look for the paper yourself based on hints in the text like author names, institutions and general topics. And then paywalls often block access to the information that you're looking for. You can try pages like ResearchGate for legal access to PDFs. Many researchers also use Sci-Hub, but as the site provides illegal access to publicly funded research, I won't recommend doing so. When you have the paper in front of you, you can either read it completely, which is kind of hard, or just read the abstract, which might be easier. The easiest is to look for science journalism articles about the paper. Twitter is actually great for finding those, as many researchers are on Twitter and like to share articles about their own research. They also like to discuss research on Twitter, so if the story is controversial, chances are you'll find some science accounts calling that out. While Twitter is terrible in many regards, it is a great tool to engage with the scientific community. You can also do a basic check-up yourself. Where was the paper published, and is it a known journal? Who are the people doing the research, and what are their affiliations? How did they do their experiment? Checking for controls and repetitions in the experiment is hard if you don't know the topic, but if you do know the topic, go for it. In the end, fact-checking takes time and energy. It's very likely that you won't do it very often, but especially when something comes up that really interests you and you want to tell people about it, you should do a basic fact-check on the science. The world would be a lot better if you only shared information that you checked yourself for plausibility. You can also help to reduce the need for rigorous fact-checking: simply do not spread stories that seem too good to be true and that you didn't check yourself or find in a credible source. Misinformation and bad science reporting spread because we don't care enough and because they are very, very attractive.
If we break that pattern, we can give reliable scientific information the attention that it deserves. But don't worry, most of the science reporting you'll find online is actually pretty good. There is no need to be extremely careful with every article you find. Still, I think it is better to have a natural alertness to badly reported science than to trust just anything that is posted under a catchy headline. There is no harm in double-checking the facts, because either you correct a mistake or you reinforce correct information in your mind. So how do I assess whether a source that I like is actually good? When I come across a new outlet, I try to find some articles in an area that I know stuff about. For me, that's plant science. I then read what they are writing about plants. If that sounds plausible, I am tempted to also trust them when they write about things like physics or climate change, where I have much less expertise. This way I have my own personal list of good and not-so-good outlets. If somebody on Twitter links to an article from the not-so-good list, I know that I have to take that information with a large quantity of salt. And if I want to learn more, I look for a different source to back up any claims I find. It is tedious, but so is science. With a bit of practice, you can internalize the skepticism and navigate science information with much more confidence. I hope I could help you with that a little bit. So that was my attempt to help you understand science better. I'd be glad if you'd leave me feedback or direct any of your questions towards me on Twitter. That's @sciencejoram. There will be sources for the things I talked about available somewhere around this video or on my website: joram.schwartzmann.de. Thank you for your attention. Goodbye.

Herald: derJoram, thank you for your talk, very entertaining and informative as well, if I might say. We have a few questions from here at the Congress that would be... where's the signal? I need my questions from the internet; all of them are from the internet.

Joram: *laughs*

H: So I will go through the questions and you can elaborate on some of the points from your talk. So the first question...

J: Yeah, I will.

H: Very good. The first question is: Is there a difference between reviewed articles and meta-studies?

J: To my knowledge, there isn't really a categorical difference in terms of peer review.
Meta-studies, which you find often in the medical field, are studies that integrate a lot of other studies, summarize their findings again and try to put them in context of one another, which makes them incredibly useful for drawing medical conclusions. Because, as I said in the talk, it's often very hard to do, for example, dietary studies, and you want to have large numbers, and you get those by combining several studies together. And usually these meta-studies are also peer reviewed. So instead of actually doing the research and running whatever experiments you want to do on humans, you instead collect all of the evidence others have stated, integrate it again, draw new conclusions from it, compare and weigh the studies and say, "OK, this study had these shortcomings, but we can take this part from this study and put it in context with this part from this other study." Because you do so much additional conclusion-drawing on top of that, you then submit it again to a journal, it is again peer reviewed, and other researchers look at it, give their expertise on it and say whether or not what you concluded from all of these things made sense. So a meta-study, when it's published in a scientific journal, is also peer reviewed and also a very good, credible source. And I would even say that meta-studies are often the studies you really want to look for if you have a very specific scientific question that you, as a sort of non-expert, want to have answered, because very often the individual studies are very focused on a specific detail of a bigger research question. But if you want to know, say, whether dietary fiber is very good for you, there's probably not a single study that will have the answer, but there will be many studies that together point towards the answer. And a meta-study is a place where you can find that answer.

H: Very good, sounds like something to reinforce the research. Maybe a follow-up question, or rather it is a follow-up question: Is there anything you can say in this regard about the reproducibility crisis in many fields such as medicine?

J: Yeah, that's a very good point. That's something that I didn't mention at all in the talk, pretty much for complexity reasons, because when you go into reproducibility, you run into all kinds of complex additional problems. It is true that we often struggle with reproducing results.
I actually don't have the numbers on how often we fail, but this reproducibility crisis that's often mentioned is the idea that researchers take a paper, whatever it studied, and other researchers try to recreate the study. A paper usually also has a 'Materials & Methods' section that details all of the things that were done; it's pretty much the instructions for the experiment, and the results of the experiment are usually in the same paper. And when they try to recook the recipe that somebody else wrote down, there is a chance that they don't find the same thing. We see that more and more often, especially with complex research questions. And that brings us to the idea that reproduction, or reproducibility, is an issue, and that maybe we can't trust science as much, or that we have to be more careful. It is true that we have to be more careful. But I wouldn't go as far as to be generally distrustful of research. And that's why I'm also saying that, for example in the medical field, you always want to have multiple studies pointing at something. You always want to have multiple lines of evidence, because if one group finds something and another group can't reproduce it, you end up in a place where you can't really say: "Did this work now? Who made the mistake, the first group or the second group?" Because when you are reproducing a study, you can also make mistakes, or there can be factors that the initial study didn't document in a way that allows it to be reproduced, because they didn't think to write down the supplier of some chemicals, and those chemicals were very important for the success of the experiment. Things like that happen, and so you don't know, when you just have the initial study and the reproduction study with different outcomes. But if you then have multiple studies that all look at a similar area, and out of 10 studies, 7 or 8 point in a certain direction, you can be more certain that this direction points towards the truth. In science, it's really hard to say: *OK, this is now the objective truth, we have found the definitive answer to the question we're looking at*, especially in the medical field. So that's a very long way of saying it's complicated. Reproduction, or reproducibility studies, are very important, but
Like, I 519 00:40:06,519 --> 00:40:11,510 wouldn't be too worried that the lack of reproducibility breaks the entire 520 00:40:11,510 --> 00:40:18,050 scientific method, because there are usually more complex issues at hand than 521 00:40:18,050 --> 00:40:22,490 just a simple recooking of another person's study. 522 00:40:22,490 --> 00:40:31,920 H: Yes, speaking of publishing some more, so this is a follow-up to the follow-up, the 523 00:40:31,920 --> 00:40:34,799 Internet asks: how can we deal with the publish or perish culture? 524 00:40:34,799 --> 00:40:41,579 J: Oh, yeah. If I knew that, I would write a very smart blog post and try to 525 00:40:41,579 --> 00:40:46,019 convince people of it. I think personally we need to rethink the way we 526 00:40:46,019 --> 00:40:50,109 do the funding, because that's what it comes down to in the end. Another issue I 527 00:40:50,109 --> 00:40:54,100 didn't really go into in much detail in the talk because it's also very complex. So 528 00:40:54,100 --> 00:40:59,880 science funding is usually defined by a decision making process; at one point 529 00:40:59,880 --> 00:41:04,810 somebody decides who gets the money, and to decide that they need a qualifier. 530 00:41:04,810 --> 00:41:09,300 Like, there are 10 research groups or 100 research groups that write a 531 00:41:09,300 --> 00:41:13,309 grant and say, "Hey, we need money because we want to do research." And they 532 00:41:13,309 --> 00:41:19,490 have to figure out, or they have to decide, who gets it, because they can't give money 533 00:41:19,490 --> 00:41:24,099 to everyone, because we spend money in our budgets on other things than just 534 00:41:24,099 --> 00:41:29,759 science. So the next best thing that they came up with was the idea to use papers 535 00:41:29,759 --> 00:41:35,730 - the number of papers that you have - to sort of get a measurement - or the quality 536 00:41:35,730 --> 00:41:40,270 of the papers that you have - to get a measurement of whether you are deserving 537 00:41:40,270 --> 00:41:44,579 of the money. And you can see how that's problematic and means that people who are 538 00:41:44,579 --> 00:41:49,089 early in their research career, who don't have a lot of papers, have a lower 539 00:41:49,089 --> 00:41:53,049 chance of getting the money. And that leads to the publish or perish idea: if 540 00:41:53,049 --> 00:41:56,900 you don't publish your results, and if you don't publish them in a very well 541 00:41:56,900 --> 00:42:01,240 respected journal, then the funding agencies won't give you money. And so you 542 00:42:01,240 --> 00:42:07,619 perish and you can't really pursue your research career. And it's really a hard 543 00:42:07,619 --> 00:42:11,730 problem to solve, because the decision about the funding is very much detached 544 00:42:11,730 --> 00:42:19,060 from the scientific world, from academia. There are like multiple levels of abstraction 545 00:42:19,060 --> 00:42:23,660 between the people who in the end make the budgets and decide who gets the 546 00:42:23,660 --> 00:42:29,660 money, and the people who are actually using the money. I would wish for funding 547 00:42:29,660 --> 00:42:36,850 agencies to look less at papers and maybe come up with different qualifiers, maybe 548 00:42:36,850 --> 00:42:44,980 also something like general scientific practice, maybe they could do audits of 549 00:42:44,980 --> 00:42:50,980 labs of some sort.
I mean, there's a ton of factors that influence good research 550 00:42:50,980 --> 00:42:57,210 that are not mentioned in papers, like work ethics, work culture, how much teaching you 551 00:42:57,210 --> 00:43:01,670 do - which can be very important but is sort of detrimental to getting more funding, 552 00:43:01,670 --> 00:43:05,760 because when you do teaching, you don't do research, and then you don't get papers, and 553 00:43:05,760 --> 00:43:10,940 then you don't get money. So, yeah, I don't have a very good solution to the 554 00:43:10,940 --> 00:43:16,410 question of what we can do. I would like to see more diverse funding, also of smaller 555 00:43:16,410 --> 00:43:21,450 research groups. I would like to see more funding for negative results, which is 556 00:43:21,450 --> 00:43:28,369 another thing that we don't really value. So if you do an experiment and it doesn't 557 00:43:28,369 --> 00:43:32,430 work, you can't publish it, you don't get the paper, you don't get money and so on. 558 00:43:32,430 --> 00:43:35,180 So there are many factors that need to change, many things that we need to touch, 559 00:43:35,180 --> 00:43:39,019 to actually get away from publish or perish. 560 00:43:39,019 --> 00:43:47,359 H: Yeah, another question that is closely connected to that is: Why are there so few 561 00:43:47,359 --> 00:43:52,420 stable jobs in science? J: Yeah, that's the 562 00:43:52,420 --> 00:43:56,349 Wissenschaftszeitvertragsgesetz, something that - I forgot when we got it - 563 00:43:56,349 --> 00:44:04,099 I think in the late 90s or early 2000s. That's at least a very German-specific 564 00:44:04,099 --> 00:44:14,269 answer: this Gesetz, this law, made it law that you have a limited 565 00:44:14,269 --> 00:44:18,750 time span that you can work in research; you can only work in research for, I think, 566 00:44:18,750 --> 00:44:23,589 12 years, and there are some footnotes and stuff around it. But there is a fixed time limit 567 00:44:23,589 --> 00:44:27,579 that you can work in research on limited-term contracts. And your funding - 568 00:44:27,579 --> 00:44:31,170 whenever you get research funding, it's always for a limited time. You always get 569 00:44:31,170 --> 00:44:36,220 research funding for three years, six years if you're lucky. So you never have 570 00:44:36,220 --> 00:44:41,019 permanent money in the research group. Sometimes you have that in universities, 571 00:44:41,019 --> 00:44:44,559 but overall you don't have permanent money. And if you don't have permanent 572 00:44:44,559 --> 00:44:49,570 money, you can't have permanent contracts, and therefore there aren't really stable 573 00:44:49,570 --> 00:44:52,940 jobs. And then with professorships or some group leader positions it changes, 574 00:44:52,940 --> 00:44:58,830 because group leaders and professorships are more easily planned. And 575 00:44:58,830 --> 00:45:02,289 therefore universities and research institutes sort of make a long-term 576 00:45:02,289 --> 00:45:07,250 budget and say "OK, we will have 15 research groups. So we have money in the 577 00:45:07,250 --> 00:45:12,810 long term for 15 group leaders." But whoever is hired underneath these group 578 00:45:12,810 --> 00:45:16,480 leaders, that has much more fluctuation and is based on sort of short-term money. 579 00:45:16,480 --> 00:45:20,529 And so there are no stable jobs there. At least that's how it is in Germany. I know that, for 580 00:45:20,529 --> 00:45:25,859 example, in the UK and in France, they have permanent positions earlier.
They 581 00:45:25,859 --> 00:45:29,900 have lecturers, for example, in the UK, where - without being a full 582 00:45:29,900 --> 00:45:35,300 professor, which has, like, its own backpack of stuff that has to be done - you can 583 00:45:35,300 --> 00:45:40,259 already work at a university in the long term on a permanent contract. So it's a 584 00:45:40,259 --> 00:45:44,839 very.. it's a problem we see across the world, but Germany has introduced its own very 585 00:45:44,839 --> 00:45:50,190 specific problems here that make it very unattractive to stay long 586 00:45:50,190 --> 00:45:56,530 term in research in Germany. H: It's true. I concur. 587 00:45:56,530 --> 00:46:02,589 J: Yes. H: *laughs* Coming now to the people 588 00:46:02,589 --> 00:46:12,720 who do science mostly for fun and less for profit. This question is: Can you write 589 00:46:12,720 --> 00:46:17,530 and publish a paper without a formal degree in the sciences, assuming the 590 00:46:17,530 --> 00:46:23,680 research efforts are sufficiently good? J: Yes, I think technically it is 591 00:46:23,680 --> 00:46:27,090 possible. It comes with some problems. Like, first of all, it's not free: 592 00:46:27,090 --> 00:46:34,240 when you submit your paper to a journal, you pay money for it. I don't 593 00:46:34,240 --> 00:46:39,560 know exactly, but it ranges; I think a safe assumption is between 1,000 and 594 00:46:39,560 --> 00:46:44,349 5,000 dollars, depending on the journal you submit to. Then very often there are, like, 595 00:46:44,349 --> 00:46:49,960 some formal problems... I've recently been co-authoring a paper, and I'm not 596 00:46:49,960 --> 00:46:56,509 actively doing research anymore. I did something in my spare time, helped a 597 00:46:56,509 --> 00:47:02,130 friend of mine who is still doing research with some basic stuff, and he 598 00:47:02,130 --> 00:47:06,619 was nice enough to put me on the paper. And then there is a form where it says, like, 599 00:47:06,619 --> 00:47:11,850 institute affiliation, and I don't have an institute affiliation in that sense. So as 600 00:47:11,850 --> 00:47:16,049 I'm just a middle author on this paper, I will be - or hopefully, if it gets 601 00:47:16,049 --> 00:47:19,609 accepted, I will be - listed as an independent researcher, but it might be 602 00:47:19,609 --> 00:47:23,930 that a journal has its own internal rules where they say we only accept people 603 00:47:23,930 --> 00:47:28,201 from institutions. So it's not really inherent in the scientific system that you 604 00:47:28,201 --> 00:47:32,470 have to be at an institution, but there are these doors, there are these 605 00:47:32,470 --> 00:47:38,239 pathways that are locked, because somebody has to put in a form somewhere which 606 00:47:38,239 --> 00:47:42,760 institution you are affiliated with. And I know that some people who do, like, DIY science, 607 00:47:42,760 --> 00:47:48,549 so they do it outside of academia, need to have partners in academia that 608 00:47:48,549 --> 00:47:54,060 help them with the publishing and also with getting access to certain things. I mean, in 609 00:47:54,060 --> 00:47:57,579 computer science you don't need specific chemicals, but if you do anything like 610 00:47:57,579 --> 00:48:02,819 chemical engineering or biology or anything, often you only get access to the 611 00:48:02,819 --> 00:48:08,170 supplies when you are at an academic institution.
So, I know that many people 612 00:48:08,170 --> 00:48:13,269 have sort of these partnerships, cooperations with academia, that allow them 613 00:48:13,269 --> 00:48:18,540 to actually do the research and then publish it as well, because otherwise, if 614 00:48:18,540 --> 00:48:23,549 you're just doing it from your own bedroom, there might be a lot of barriers 615 00:48:23,549 --> 00:48:27,490 in your way that might be very hard to overcome. But I think if you're really, 616 00:48:27,490 --> 00:48:35,210 really dedicated, you can overcome them. H: Coming to the elephants in that 617 00:48:35,210 --> 00:48:41,160 bedroom: What can we do against the spread of false facts - 5G, corona 618 00:48:41,160 --> 00:48:48,240 vaccines? So they are very.. They get a lot of likes and are spread like a disease 619 00:48:48,240 --> 00:48:56,099 themselves. And it's very hard to counter these arguments, especially in personal encounters, 620 00:48:56,099 --> 00:49:01,609 because apparently a lot of people are not that familiar with the 621 00:49:01,609 --> 00:49:04,700 scientific method. What's your take on that? 622 00:49:04,700 --> 00:49:09,329 J: Yeah, it's difficult. And over the years I've read many different 623 00:49:09,329 --> 00:49:15,630 approaches, ranging from not actually talking about facts - because often 624 00:49:15,630 --> 00:49:18,960 somebody who has a very predefined opinion on something knows a lot of 625 00:49:18,960 --> 00:49:22,989 false facts that they have on their mind. And you, as somebody talking to them, 626 00:49:22,989 --> 00:49:26,160 often don't have all of the correct facts in your mind. I mean, who runs around 627 00:49:26,160 --> 00:49:31,529 with, like, a bag full of climate facts and a bag full of 5G facts and a bag full 628 00:49:31,529 --> 00:49:37,880 of vaccine facts, in the same quantity and quality as the stuff that 629 00:49:37,880 --> 00:49:41,089 somebody who read things on Facebook has in their backpack and their sort 630 00:49:41,089 --> 00:49:47,119 of mental image of the world? So just arguing on the facts is very hard, 631 00:49:47,119 --> 00:49:52,670 because people who follow these false ideas are very quick at making turns 632 00:49:52,670 --> 00:49:56,319 and they throw one thing at you after the other. And so it's really hard 633 00:49:56,319 --> 00:50:01,079 to just go "but actually...", debunking fact one and then debunking the next wrong 634 00:50:01,079 --> 00:50:07,859 fact. So I've seen a paper where people try to do this sort of from an argumentative 635 00:50:07,859 --> 00:50:13,239 standpoint. They say: "Look: You're drawing false conclusions. You say because 636 00:50:13,239 --> 00:50:20,820 A, therefore B, but these two things aren't linked in a causal way. So you 637 00:50:20,820 --> 00:50:25,260 can't actually draw this conclusion." And so they sort of try to destroy that argument on 638 00:50:25,260 --> 00:50:31,659 a meta level instead of on a fact level. But that is also difficult. And usually 639 00:50:31,659 --> 00:50:36,980 people who are really devout followers of false facts are also not followers 640 00:50:36,980 --> 00:50:42,769 of reason, so any reason-based argument will just not work for them, because they 641 00:50:42,769 --> 00:50:51,900 will deny it. I think what really helps is a lot of small-scale action in terms of 642 00:50:51,900 --> 00:50:56,779 making scientific data - so, making science - more accessible.
And I mean, I'm a science 643 00:50:56,779 --> 00:50:59,940 communicator, so I'm heavily biased. I'm saying we need more science 644 00:50:59,940 --> 00:51:04,789 communication, we need more low-level science communication. We need to have it 645 00:51:04,789 --> 00:51:09,031 freely accessible, because all of the stuff that you read with the false facts, this 646 00:51:09,031 --> 00:51:14,210 is all freely available on Facebook and so on. So we need to have a similarly low 647 00:51:14,210 --> 00:51:22,189 level, a low entry level, for the correct facts - for the real facts. And this is 648 00:51:22,189 --> 00:51:25,970 also.. it's hard to do. I mean, in the science communication field there's also a lot of 649 00:51:25,970 --> 00:51:31,339 debate about how we do that. Should we do that through more presence on social media? Should 650 00:51:31,339 --> 00:51:38,130 we simplify more, or are we then actually oversimplifying - where is the balance? 651 00:51:38,130 --> 00:51:43,819 How do we walk this line? So there's a lot of discussion and still ongoing learning 652 00:51:43,819 --> 00:51:48,130 about that. But I think in the end that is what we need: we need people to be 653 00:51:48,130 --> 00:51:56,746 able to find correct facts just as easily and understandably as they find the 654 00:51:56,746 --> 00:52:05,210 fake news and the false facts. Like, we need science to be communicated as clearly as some 655 00:52:05,210 --> 00:52:11,279 stupid share that rolls across Facebook, as an image that - I don't want to repeat all of 656 00:52:11,279 --> 00:52:17,680 the wrong claims, but something that says something very wrong, but very persuasively. 657 00:52:17,680 --> 00:52:22,190 We need to be as persuasive with the correct facts. And I know that many people 658 00:52:22,190 --> 00:52:28,099 are doing that by now, especially on platforms like Instagram or TikTok. You find 659 00:52:28,099 --> 00:52:33,309 more and more people doing very high-quality, low-level - and I mean that on 660 00:52:33,309 --> 00:52:40,170 sort of a jargon level, not on a sort of intellectual level - so very low-barrier 661 00:52:40,170 --> 00:52:46,700 science communication. And I think this helps a lot. This helps more than sort of very 662 00:52:46,700 --> 00:52:52,569 complicated pages debunking false facts. I mean, we also 663 00:52:52,569 --> 00:52:56,951 need these as references. But if we really want to combat the spread of fake news, we 664 00:52:56,951 --> 00:53:01,589 need to just be as accessible with the truth. 665 00:53:01,589 --> 00:53:10,749 H: A thing closely connected to that is: "How do we find or detect 666 00:53:10,749 --> 00:53:16,319 human error?", since I guess people who are watching this talk have already started 667 00:53:16,319 --> 00:53:23,380 the process of fine-tuning their bullshit detectors. But when, for example, 668 00:53:23,380 --> 00:53:27,010 something very exciting and promising comes along - CRISPR/Cas, for example, or 669 00:53:27,010 --> 00:53:39,489 something - how do we go forward so as not to be fooled by our own already tuned bullshit 670 00:53:39,489 --> 00:53:46,400 detectors and fall for false conclusions? J: I think a main part of this is 671 00:53:46,400 --> 00:53:54,200 practice. Just try to look for something that would break the story - just not for 672 00:53:54,200 --> 00:53:57,829 every story that I read, that's a lot of work.
But from time to time, pick a 673 00:53:57,829 --> 00:54:01,119 story where you're like "Oh, this is very exciting" and try to learn as much as you 674 00:54:01,119 --> 00:54:05,870 can about that one story. And by doing that, also learn about the process, how 675 00:54:05,870 --> 00:54:12,279 you drew the conclusions, and then compare your final picture, after you did all the 676 00:54:12,279 --> 00:54:18,640 research, to the thing that you read in the beginning, and see where there are things 677 00:54:18,640 --> 00:54:23,010 that are not coming together and where there are things that are the same, and 678 00:54:23,010 --> 00:54:30,109 then, based on that, practice. And I know that that's a lot of work, so that's sort 679 00:54:30,109 --> 00:54:38,150 of the high-impact way of doing it: by just practicing and actively doing 680 00:54:38,150 --> 00:54:43,900 the check-ups. But the other way you can do this is to find people whose opinion you 681 00:54:43,900 --> 00:54:51,039 trust on topics and follow them - follow them on podcasts, on social media, on 682 00:54:51,039 --> 00:54:56,579 YouTube or wherever. And especially in the beginning, when you don't know them 683 00:54:56,579 --> 00:55:01,059 well, be very critical of them; it's easy to fall into a sort of trap here 684 00:55:01,059 --> 00:55:06,339 and follow somebody who actually doesn't know their stuff. But there are 685 00:55:06,339 --> 00:55:09,970 some people, I mean, in this community here - I'm probably not telling you anything new - 686 00:55:09,970 --> 00:55:16,650 if you follow people like minkorrekt, like methodisch inkorrekt, they are great for a 687 00:55:16,650 --> 00:55:19,470 very.. I actually can't really pin down which scientific area, because in their 688 00:55:19,470 --> 00:55:22,740 podcast they touch on so many different things, and they have a very high-level 689 00:55:22,740 --> 00:55:28,599 understanding of how science works. So places like this are a good start to get a 690 00:55:28,599 --> 00:55:35,049 healthy dose of skepticism. Another rule of thumb that I can give is: usually 691 00:55:35,049 --> 00:55:40,059 stories are not as exciting when you get down to the nitty-gritty details. Like, I'm 692 00:55:40,059 --> 00:55:45,220 a big fan of CRISPR, for example, but I don't believe that we can cure all 693 00:55:45,220 --> 00:55:49,369 diseases just now because we have CRISPR - there are very limited things we can 694 00:55:49,369 --> 00:55:54,829 do with it, and we can do much more with it than what we could do when we didn't have 695 00:55:54,829 --> 00:56:00,549 it. But I'm not going around thinking now we can create life at will because we 696 00:56:00,549 --> 00:56:05,849 have CRISPR, we can fight any disease at will because we have CRISPR. So in 697 00:56:05,849 --> 00:56:11,059 general a good rule of thumb is: just calm down, look at what's really in there and see 698 00:56:11,059 --> 00:56:18,490 how much.. or just tone it down by like 20% and take that level of excitement 699 00:56:18,490 --> 00:56:22,130 with you, instead of going around being scared or overly excited about a new 700 00:56:22,130 --> 00:56:28,630 technology that you think has been found, because we rarely make these massive jumps 701 00:56:28,630 --> 00:56:34,769 that we would need to start to worry or get overexcited about. 702 00:56:34,769 --> 00:56:42,520 H: Very good, so, very last question: Which tools did you use to create these nice 703 00:56:42,520 --> 00:56:47,910 drawings?
J: *laughs* Oh, a lot of people won't like 704 00:56:47,910 --> 00:56:53,343 me for saying this, because this will sound like a product promo. But there is.. I use 705 00:56:53,343 --> 00:56:59,349 an iPad with a pencil, and I used an app called Affinity 706 00:56:59,349 --> 00:57:04,380 Designer to draw the things on there, because that also works very well across devices. So that's how I 707 00:57:04,380 --> 00:57:08,849 created all of the drawings, and I put them all together in Apple Motion and exported 708 00:57:08,849 --> 00:57:14,649 the whole thing in Apple Final Cut. So this now sounds like a sales pitch for all 709 00:57:14,649 --> 00:57:17,329 of these products. But I can say, for me they work very well, and there are 710 00:57:17,329 --> 00:57:23,640 alternatives for pretty much everything along the way. I mean, I can say - because I'm also 711 00:57:23,640 --> 00:57:28,019 doing a lot of science communication with drawings for the Plants and Pipettes project 712 00:57:28,019 --> 00:57:33,039 that I am part of - that an iPad with a pencil and Affinity Designer get you 713 00:57:33,039 --> 00:57:38,530 very far for high-quality drawings with very easy access, because I'm in no way an 714 00:57:38,530 --> 00:57:44,940 artist. I'm very bad at this stuff. But I can hide all my shortcomings, because I 715 00:57:44,940 --> 00:57:49,170 have an undo function on my iPad, and because everything is a vector drawing, 716 00:57:49,170 --> 00:57:54,140 I can delete every stroke that I made - even if I realize, like, an hour later that 717 00:57:54,140 --> 00:57:58,589 this should not be there, I can reposition it or delete it. So vector 718 00:57:58,589 --> 00:58:03,739 files and a pencil and an undo function were my best friends in creating 719 00:58:03,739 --> 00:58:09,079 this video. H: Very good, derJoram. Thank you very 720 00:58:09,079 --> 00:58:14,151 much for your talk and your very extensive Q&A. I think a lot of people are very 721 00:58:14,151 --> 00:58:16,151 happy with your work. J: Thank you. 722 00:58:16,151 --> 00:58:21,619 H: And they are actually saying in the pad that you should continue to communicate science to 723 00:58:21,619 --> 00:58:24,670 the public. J: That's very good, because that's my job. 724 00:58:24,670 --> 00:58:27,700 *laughs* It's good that people like that. H: Perfect. 725 00:58:27,700 --> 00:58:31,529 J: Thank you very much. H: So, a round of applause and some very 726 00:58:31,529 --> 00:58:39,760 final announcements for this session. There will be the Herald news show and the 727 00:58:39,760 --> 00:58:47,920 break. So stay tuned for that. And I would say, if there are no further... no, we 728 00:58:47,920 --> 00:58:53,339 don't have any more time, sadly, but I guess people know how to connect to you 729 00:58:53,339 --> 00:58:59,299 and contact derJoram if they want to know anything more. 730 00:58:59,299 --> 00:59:14,869 *rC3 postroll music* 731 00:59:14,869 --> 00:59:40,000 Subtitles created by c3subtitles.de in the year 2020. Join, and help us!