Hey, everybody. As promised, we're gonna talk about pre-attentive processing. And as you can see here on this slide, this is one of the most important things I can teach you in this entire class. Some of this, probably all of this, is gonna be mostly new information for you, but it should affect every single design decision you make from here forward, not just in this class. Every time you create any visual of any kind for anybody, you should be thinking about pre-attentive processing. But no pressure, right?

Okay, so let's talk about this. I'm gonna show you an image really fast, and I want you to look at the screen. I'm gonna count down so you don't miss it, and think about what you see. Okay, ready, here we go. Three, two, one, there it is. And that wasn't even fast enough. If I ask you what you saw, you're gonna say, "Jumbled lines, I don't know, I couldn't make sense of it," right? Now, if I do that again, but this time I apply design to that same image, let's see what you see. Ready, here we go. Three, two, one, there it is.
Now, what did you see, right? I designed it. I didn't remove anything, but by changing the color, making it thicker, making it brighter, I begged you, almost forced you, to see the one line that matters: my silhouette for this set of lines, for this data. So I can still leave all the other data there, all the other lines, to give you that context, but I can really draw your attention to the thing that really matters. I'm doing this because I know what your pre-attentive response is going to be.

What do I mean by pre-attentive response, pre-attentive processing? Here's the thing. Every single visual experience you have in your life occurs pre-attentively, meaning before you're paying attention: subconsciously, very quickly. So every visual experience is like this. This morning, when you walked into, let's say, a classroom, or your kitchen to get a cup of coffee, at that moment you had an immediate, subconscious understanding of the size of the space, how many people were already there before you, where there was room to walk around.
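The highlight technique described here (keep every context line, but change the color, weight, and brightness of the one that matters) can be sketched in matplotlib. This is a minimal sketch with invented data, not the actual chart from the lecture:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt
import random

random.seed(0)
fig, ax = plt.subplots()

# Context lines: thin, light gray, pushed into the background.
for _ in range(8):
    ys = [random.uniform(0, 10) for _ in range(12)]
    ax.plot(range(12), ys, color="#cccccc", linewidth=1, zorder=1)

# The one line that matters: brighter, thicker, drawn on top.
focus = [i * 0.8 + random.uniform(-1, 1) for i in range(12)]
(highlight,) = ax.plot(range(12), focus, color="#d62728", linewidth=3, zorder=2)

fig.savefig("highlight.png")
```

Nothing is removed; the gray lines still provide context, but the pre-attentive response goes straight to the bright, thick line.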
Every visual experience occurs this way. Same thing when you look at a chart. Yes, when you look at a chart, subconsciously, instantaneously, pre-attentively, you see that this bar is way taller than that bar. You're judging it before you're even thinking about it.

This all comes from the field of Gestalt psychology, and there are some Gestalt principles at play here that help us understand this a little bit more. One of them is called figure-ground. The basic idea: all human beings will subconsciously, pre-attentively see the figure, meaning the object in the foreground, in this case the logo, and the ground, meaning the background, the negative space. The FedEx logo is a very famous example of figure-ground being leveraged. Now, I bet half of you already know the secret in the FedEx logo, and half of you probably don't. All of us who know it were told it, so don't think that you should have seen it on your own. It's okay, no judgment whatsoever; I was taught this. But look between the orange E and the orange X: what do you see? Yeah, FedEx is a shipping company.
It's about movement. That arrow is not a coincidence, it's not a mistake. It's there for a reason. Figure-ground is used all the time in logo design to reinforce a brand idea, okay?

So figure-ground, negative space and positive space, the actual figure, the object in the foreground: what does it have to do with data visualization? Well, here's what it has to do with data visualization. Humans live on Earth, and Earth has gravity. The mountains are down here, and the sky is up above. So we naturally think that the stuff down here is the data, and that the stuff above is the background. Whether it's white on black or black on white doesn't matter. This is one of the reasons why we expect low numbers to be at the bottom of a chart and high numbers to be at the top of a chart: because things go up like that, okay? By the way, another thing to be conscious of: we also expect low numbers to the left and high numbers to the right. We also expect bad stuff to the left and down, and good stuff to the right and up, just so you know. Anyway, figure-ground.
It's not the most important Gestalt principle, not the strongest pre-attentive trigger, which is why we all have to learn about the FedEx logo; we don't necessarily see it on our own. A much, much more powerful pre-attentive trigger, a Gestalt principle, is the principle of proximity. Every human being on earth will tell you the dots on the left are one group and the dots on the right are a different group; they're different from each other. Even though these two dots share proximity, it doesn't matter, because this dot, even though it's far away from this dot, the entire group has proximity, and it lacks proximity with this group. We see this subconsciously, pre-attentively; we cannot help it. So position of objects is something we are very good at seeing pre-attentively, and essentially at detecting variance in. This is why scatter plots work so well. There's a bunch of dots over here, and there's one weird one over there, an outlier, right? The dots are all clustered together on this line; I see that pattern partially because of proximity. Another one is the principle of similarity.
Even though this dot and this dot are right next to each other, they have proximity, but because of similarity, in this case color, I assume this dot goes with this dot, and this dot goes with that dot. Subconsciously, pre-attentively, I see it that way; can't change that, okay? This, by the way, is why we use color and/or shape and/or contrast to show categories, groupings of objects. Again, a scatter plot with a thousand dots, some black, some white: I'm gonna detect the two groups, right? That's the whole goal when I look at a scatter plot with two categories depicted that way.

There's a really interesting one called parallelism. Everyone on earth will tell you that the three parallel lines go together and the other ones are different. Even a six-month-old baby who has not taken geometry yet will tell you that, okay? This is why line charts work. I see lines and they're not parallel at all: these are different, they're going in opposite directions. Oh, no, no, look how closely they're tracking together. The line is telling me what the pattern in the data is.
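The similarity idea, color marking category membership regardless of position, is exactly what a two-category scatter plot relies on. A minimal sketch, with invented data and category names:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import random

random.seed(1)
n = 500
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

# Invented category labels. Similarity of color, not proximity,
# is what lets the eye pre-attentively separate the two groups.
cats = [random.choice(["a", "b"]) for _ in range(n)]
colors = {"a": "black", "b": "white"}

fig, ax = plt.subplots()
scatter = ax.scatter(xs, ys,
                     c=[colors[c] for c in cats],
                     edgecolors="gray")
fig.savefig("groups.png")
```

Even with the points fully interleaved in position, the two colors read as two populations at a glance.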
Last but not least is the principle of common fate. No proximity, no similarity, nothing to tell me what's going on with these data points, until I see that some of them share a common fate, in this case through animation.

So there's a limited set of things that we tend to do in data visualization to reveal trends, patterns, and outliers to our audience. Some of these are stronger pre-attentive triggers than others. Position is a very powerful pre-attentive trigger, because of the Gestalt principle of proximity. Size, very good, although less so with circles than with other objects like squares. Length, very good. Orientation, very good. Others of these, not so good, and there's a lot of nuance around this that I'm not gonna get into in this lesson. When you read Tamara Munzner's chapter, you'll learn more about that. But like I said before, this is very helpful for picking charts. Now, a little bit more information behind this, some research behind this stuff that really helps drive it home.
This is the study that I mentioned earlier, one of the studies where we learned that humans look at the center of an image first. These are eye-tracking studies. And if you have human-recognizable objects in your graphic, like dinosaurs, people will look there next. So lucky you, if you're making infographics about dinosaurs. The rest of us, stuck in PowerPoint land on the right, creating charts and graphs with text: the eye's all over the place, we don't know where to look. That's okay. But what we learned here is that there's that fixation bias toward the center. Therefore, it's all about the glance. What do we see at a glance that's gonna help us know where to look, what to investigate, in this thing that we're looking at?

Given that, we can learn from a separate study, which says: if it's all about the glance, then you know what, it must be all about your peripheral vision. This study took a bunch of infographics like this one and ran them through a simulation of what you would see in your peripheral vision. So imagine you were looking at this red leaf right in the center of this infographic.
If you were, this is what you would see in your peripheral vision. So the first lesson: your peripheral vision sucks, okay? Second lesson: there's "a case for using regular monochrome boxes." That's how the researchers phrased it. What does that mean? Well, the brown blobs full of content, I can still see they're there, so I might look at those. You can see how this works, right? There's also a case against using irregularly placed decorative elements. All the crap at the top, all the junk at the bottom, it just blows up. That doesn't mean don't use those. No, it's fine, use those, they're pretty, they're nice, whatever, but I may not notice them in my peripheral vision, and therefore they're not going to draw my attention. That's what this is all about: is it going to draw attention, yes or no?

So, by the way, I'm gonna have a little moment here, a little bit of an aside, and teach you the biggest secret in all of design. What makes this infographic so God-awful? I hate this thing, I despise it, it's horrible, it's ugly.
It makes me sad. Why? Now, some people will say, well, the color scheme's kind of gross. Yeah, I hate this color scheme, but that's not it. Some people may say the type color kind of blends in. Yeah, terrible typography in a hundred different ways. Hate it, but that's not it. You know what the number one offense of this disgusting piece of design is? It's the alignment. Giant gap here, practically no gap here. Medium size, smaller size. The left-hand edges of all these things aren't lined up with each other. There is literally zero logic. It's like they threw a dart at the stupid computer to decide where the different things should go. This is right-aligned text over this stuff. This is centered. This is neither centered nor right-aligned nor left-aligned. What on earth is going on here? This is a header, "Nutrition Facts," and these are subheads of that. No type hierarchy, everything's terrible, but especially the alignment, okay? Even if you did not know that consciously, subconsciously this drives you crazy too.
I promise you. So trust me: alignment. Simply line stuff up and your designs will be better.

All right, back to pre-attentive processing and this research. In this example, from the same research study, the software is focused at the center of this radial diagram. Keep an eye on the people icons and these shapes down here. The people icons are slightly closer to the center, so they should hold up better. Oops, no, they do not. Does this mean don't use people icons? No, it just means that this is why well-defined, cohesive graphic shapes are used so frequently. Those simple shapes below have nice solid edges, so they hold up better; your peripheral vision struggles with weird shapes. So use people icons, I love icons, go for it. But if I really wanted to draw attention to those icons, like first, essentially, I would put them in a block of color, the lesson we learned from the previous example, and knock out the icons in the background color, right?
So last but not least, in this case the software is focused at the center of the bar chart. Look at the Twitter and Facebook logos. Yeah, the Twitter logo's slightly further away, but not that much further away. Once again, it's the weird shape. That's why so many logos are squares with the logo knocked out in the background color.

By the way, I don't know how many of you have ever heard the acronym BAN in data visualization. It's a very important technical acronym, B-A-N. It stands for big-ass number, okay? Sometimes it's okay to use a big-ass number, a BAN, in your designs. Here we have $50,000 in nice, bold typography. It's good, there's nothing wrong with that. But it doesn't perform so well in your peripheral vision either. So once again, if I really wanted to draw attention to that number, I might knock it out in the background color to really draw the eye. Now, you can't knock everything out in the background color, because then the entire infographic is one big giant block of color.
Now you're trying to emphasize everything, and therefore you're emphasizing nothing. We'll talk more about that when we talk about design in a later module.

So, pre-attentive processing: very, very important. This is how you experience your world. It's also worth noting there is such a thing as attentive processing, which also works. What does that mean? It means that if I teach you at the beginning of a data story that every time you see the color purple it means, I don't know, let's say mac and cheese. I'm doing a data story about food. Purple means mac and cheese, and orange means broccoli. Which, by the way, would be a terrible color selection, because of course purple should mean eggplant and orange should mean mac and cheese. There's another lesson there: use colors that actually match the thing, if appropriate and relevant. In fact, let me use those examples; it's probably smarter just to talk that way. Purple means eggplant, orange means mac and cheese.
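In practice, that kind of priming usually comes down to fixing one category-to-color mapping up front and reusing it in every chart of the story. A minimal sketch, where the food categories and hex colors are invented for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# One mapping, declared once and reused everywhere,
# so the audience stays primed across all slides.
FOOD_COLORS = {"eggplant": "#6a3d9a", "mac and cheese": "#ff7f00"}

def bar_chart(values, filename):
    """Draw a bar chart whose bars always use the shared color mapping."""
    fig, ax = plt.subplots()
    foods = list(values)
    bars = ax.bar(foods, [values[f] for f in foods],
                  color=[FOOD_COLORS[f] for f in foods])
    fig.savefig(filename)
    return bars

# Slide 1 and "slide 47" pull from the same mapping,
# so purple always reads as eggplant.
bars = bar_chart({"eggplant": 3, "mac and cheese": 7}, "slide47.png")
```

The point is the shared dictionary: any chart that bypasses it breaks the priming you established at the start of the story.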
Once I've established that, I can go 47 slides later, and on slide 47, you are primed. You are ready, as soon as that chart appears, to see purple and know it means eggplant, and to see orange and know it means mac and cheese. When you are primed, in other words when you're being attentive to color, you will also very quickly see and detect patterns and understand the stuff in the data. So we'll talk more about this sort of design and triggering responses, et cetera. But do know that while your audience will by default receive information and have that pre-attentive response, if you prime them for a certain response, they will react well to that too.

So yes, as I mentioned, you're gonna be reading from Colin Ware's book, "Information Visualization: Perception for Design." It's a fairly heavy academic book about visual perception, and like I said, there's some technical stuff; skim past the stuff that doesn't seem relevant to you. You're not gonna be tested on this. Just be aware of that whole idea of pre-attentive processing, and know that there's more nuance in his book than what I've covered today.
There's also the other book, from Tamara Munzner, which talks about the relative importance and hierarchy of these triggers. Definitely read that entire chapter, because you'll definitely learn stuff there. And yeah, I apologize that this is a lot of reading this week; most of this class is not about reading, but I think it's important stuff. Like I said, you know, skim it. You don't have to learn every single word from either of these readings; I think you'll know what's relevant to you and what's less relevant. Just remember what I've talked about here today, and, you know, learning about that hierarchy is also very, very important. All right, thanks, everybody.