All right, hi class, we're back at Modeling Complex Systems. In this video, what I'd really like to do is go over the three key words of our title: models, complex systems. What do these words even mean? I'm sure some of you signed up for this course because you have to; it's a mandatory or core course for our complex systems program. I'm sure some of you just joined because the description was interesting, but that doesn't mean you have an idea of what we're going to do here. So I want to clear up some common misconceptions about the course, and also just talk about what models and complex systems mean here.

There's a classic quote from Albert Einstein that I'm sure a lot of you have already heard: things should be made as simple as possible, but not simpler. And this is really one of the key insights behind what we mean here by models: we try to simplify the world, be it by just thinking about what's important, or through mathematical equations or computational programs. The idea is to simplify something in order to better understand it, right? So if I ask you at some point in this semester to model an ant colony, you could go dig one up in your backyard and bring it, in a glass box, to class, or show it off on video. But that doesn't really help us understand how ant colonies work. Similarly, you could take one ant and look at it very closely, but that wouldn't really help you understand how colonies come to be these complex structures. How can ant colonies grow mushrooms? How can they discover things and search for food? It's really about their behavior as a whole. So there has to be a sweet spot somewhere between considering more than one ant and not embracing all the complexity and craziness of the real world.
So we're going to try to use computational and mathematical models as parables, or cartoons, of what the world is, to help us understand the key ingredients. There's another quote that I think better captures what I mean by this, and I'm stealing this juxtaposition of two quotes from a book called "The Computational Beauty of Nature" by Gary William Flake. The second quote is from the philosopher Bertrand Russell, who says, "The point of philosophy is to start with something so simple as not to seem worth stating, and to end with something so paradoxical that no one will believe it."

We'll see more of that later, but that's really what we mean here by models and complex systems: trying to build something that's simple in its construction and yet helps us understand the richness of the world.

This definition, rule-based phenomenological mechanisms, might sound a bit obscure. What I really want to stress is that this course is about mechanistic models, mental representations, mathematical and computational representations, but not necessarily curve fitting. There's a whole world of statistical models and ways of dealing with data; here we're more concerned with the models themselves. We will do a little bit of model fitting and model selection much later in the class, but here it's more about the construction, and the key word is mechanisms: models that try to help us understand what drives a system. It's about qualitative behavior and outcomes, which is what I mean by phenomenological in this definition.

And one thing we like to say at the Complex Systems Center here is that data science and modeling are sort of the microscope and the lab bench for this growing field of complexity science, right?
To do biology, for example, you need to be able to observe biology in the wild, maybe with a microscope, in all its glory, cells fighting each other, but you also need to be able to do experiments in a controlled setting at the lab bench. Data science is how we observe the world in complexity science, and modeling is sort of how we do experiments. So models become in silico, or silicon, laboratories where we can run our own controlled experiments and try to figure out under what conditions something interesting happens.

If you're more of a physicist, instead of a microscope you can think of a telescope: data science is how we observe deep space. Modeling would then be something like our particle accelerator, right? Something more controlled, where we can run experiments in our heads, on our computers, or with pen and paper. That's the role of modeling: to allow us to do these experiments.

And before I move on to complex systems, I just want to say that modeling in general was a real revelation for me, because it's this deeply romantic tool: it allows you to do science wherever you want. You can be by a fire on the beach of Lake Champlain, or walking down a railroad track at night. An idea comes into your head, you're curious about a system, and, once you've trained yourself in modeling, you can start thinking about what the key parts of that system are, what the key mechanisms are, and start running experiments in your head. One of the reasons I call it a romantic tool is that once you master it, it allows you to follow your heart and your curiosity, to tackle any question that you think is deep and important, and to follow that curiosity across different fields. So it really brings scientists together and gives us a common language.
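To make the "in silico laboratory" idea concrete, here is a minimal sketch of the kind of rule-based, mechanistic toy model the course has in mind, written in Python. It is not from the lecture; the grid size, the random-walk rule, and every other parameter are assumptions chosen purely for illustration. Ants wander a grid looking for food, and the "experiment" asks a qualitative question: how does search time change as the colony grows?

```python
# A minimal, illustrative sketch (not from the lecture): a rule-based "ant" model
# used as an in silico experiment. All names and parameters are assumptions
# chosen for illustration, not part of the course material.
import random

def run_colony(num_ants, grid_size=20, num_food=10, max_steps=5000, seed=0):
    """Return how many steps a colony of random-walking ants needs to find all the food."""
    rng = random.Random(seed)
    food = {(rng.randrange(grid_size), rng.randrange(grid_size)) for _ in range(num_food)}
    ants = [(grid_size // 2, grid_size // 2)] * num_ants  # every ant starts at the "nest"

    for step in range(1, max_steps + 1):
        new_positions = []
        for x, y in ants:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])  # simple rule: random walk
            x, y = (x + dx) % grid_size, (y + dy) % grid_size        # wrap around the grid edges
            food.discard((x, y))                                     # pick up food if we land on it
            new_positions.append((x, y))
        ants = new_positions
        if not food:
            return step        # all food found
    return max_steps           # gave up

# The "experiment": vary the colony size under otherwise identical, controlled conditions.
for n in [1, 5, 25, 100]:
    print(n, "ants ->", run_colony(n), "steps to find all the food")
```

Notice that nothing here is fitted to data; a few simple rules let us run controlled experiments, on a computer, that we could never run on a real colony.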
And the idea of a common language is also at the core of what we call complex systems. The classic definition goes as follows: complex systems can sometimes behave in ways that are entirely unpredictable. The human brain, for example, might be described in terms of cellular functions and neurochemical interactions, but that description does not explain human consciousness, a capacity that far exceeds simple neural functions. Consciousness is an emergent property, in other words, something that's more than just the sum of its parts. Exactly.

The whole is more than the sum of its parts. If you're in this class, chances are you've heard this sentence already; that's the classic definition of a complex system.

I really like the way this is illustrated by Peter Dodds on this website, almost an online textbook, called "Complexity Explained." The address is in the bottom left of the screen, and I invite you all to go check it out. Peter, the director of our center, has this great quote on there: "There's no love in a carbon atom, no hurricane in a water molecule, no financial collapse in a dollar bill."

In each of these sentences, love, hurricane, and financial collapse would be what Jordi described as emergent properties, and it's this idea that you can't study the whole by only studying the individual parts of the system.

Murray Gell-Mann, one of the founders of the Santa Fe Institute and of complexity science as we know it today, originally proposed that this area of research should be called plectics, meaning woven, interwoven, interconnected, or, more to the point, inseparable. I really like this original proposal. The word itself is maybe a little pretentious, but what I do like about it is that it's not about the systems, it's about the philosophy with which we tackle the systems.
When you start calling yourself a complex systems scientist, you might get some pushback from other scientists, because it sounds like everyone else is studying simple systems, and that's just not the case, right? Political scientists, evolutionary biologists, immunologists, ecologists: most fields of science are studying complex systems. So what do we mean when we use that word? We mean that we embrace the fact that all of these fields, all of these systems that we care about, are actually interconnected and can't be separated; they're plectics, interwoven. So it's more about the philosophy with which you tackle these systems, and about signaling that you're looking for a common language. Modeling is one of these common languages, and complex systems is another common language for doing exactly that.

That's what we're going to be trying to do here, through modeling, but often through data science as well: see how different fields can come together to do transdisciplinary research.

So just to sum that up very briefly: when I say modeling here, I just mean representing. It's not about fitting a line or forecasting, and it doesn't have to be mathematical; sometimes it's just going to be a cartoon putting things in boxes, sometimes it's just going to be a mental representation. It's about representing. By complex we mean inseparable, things that are interwoven, interconnected. And by systems we just mean wholes, right? We mean that we take a big-picture perspective on whatever it is we care about. That's broad, but that's what this course is about. So it is a complex business that we're trying to tackle here.

I also invite you to look at our other courses, Principles of Complex Systems and our data science courses, if you want a better picture of the different sets of tools that define this common language of complexity science.