Hello and welcome to the lecture on social science research ethics, where we're going to talk about how to do ethical research and some of the considerations that go into it.

We're going to talk about ethical issues, about a few controversies from the past that have motivated this work, and then about the IRB, the Institutional Review Board, the body at UVM that ensures researchers like us do ethical research.

For many reasons, it's vital that the research we do is ethical, everything from avoiding being sued to your own personal integrity. There are many reasons to make sure that what we do is ethical: that it not only has value, but also does no harm to the subjects of our research.

We're going to focus here on the research subjects, those we collect data from. We'll go through a list of things we should be concerned about, as well as how to be ethical in our analysis and reporting, and then we'll talk about the IRB.

It's vital that your subjects participate voluntarily; it is not ethical to force somebody to take part. For me to do research and say, "I will fail you in this class if you don't," or to make any other sort of threat, is not ethical. Subjects have to participate of their own free will. It's important that you clearly articulate the purpose of the research and any benefits that could come to them or to society as a whole, and that you use no coercion. Don't threaten people or suggest that bad things will happen, like, in my case, failing you or giving you a bad grade if you don't take part. They have to volunteer.

There have been some famous cases of unethical research, and I'll leave these links here; feel free to look. The Nazis conducted experiments on people in the concentration camps.
And there is a very well-known study from Tuskegee, where African-American men were deliberately not treated for syphilis so that researchers could follow the progress of the disease. In neither of these cases did the subjects volunteer; the research was done against their will. These are clear motivations behind the policies and frameworks we have now.

So, basically, we don't want to harm our subjects, and that covers much more than the two cases on the last slide. We don't want to embarrass them. We don't want to endanger their job, their friendships, their marriage, or their finances. And we don't want to harm them psychologically. There have been studies in the past where subjects were harmed psychologically, and we'll talk soon about the rule that if you ever use deception, you have to debrief subjects at the end and tell them the truth. I'll say a bit more about that.

The cornerstone of ethical research with human subjects is called informed consent: you clearly tell subjects what the benefits are to them, what the benefits are to society, why the study is being done, and whether there is any risk in participating. In most cases, they then give written or oral consent: they read or hear the information, and they either sign or say, "Yes, I volunteer." Consent is given based on full information about why the study is being done, what the benefits are, and what the risks to them, if any, are.

Now, two similar concepts: anonymity and confidentiality. They are similar, but not exactly the same. In neither case do you report, "This person, named so-and-so, gave this response"; both protect against that. When a study is anonymous, neither the researcher nor the readers of, say, the research article know who said what.
Think of something like a mail-in or an online survey where you don't ask for a name: nobody knows who said what except the respondent. This is very common in surveys.

If you're doing something like an interview, then all you can promise is confidentiality. You do know who said what, because you were there; you, as the researcher, heard them. Handling data that way requires training in research ethics, like this one, and in many cases you remove the identifying information (there's a short sketch of what that can look like after the video prompt below). At most, if a person's role is important, you might say something like "A student said that..." or "A faculty member said that...," but you never give their name and never attribute what they said to any individual. That is what confidentiality means. It would be good if you know the difference between the two concepts and can provide an example.

Sometimes in social science or psychological research there is deception: you tell somebody something that isn't true, make them think they're doing one thing, or make them think there is a consequence to what they do. I will say that I have never used deception, and I can't imagine that I would, but it has been done. If you do use it, then after the fact you really have to tell subjects what they actually did and what the real reasons were: fill them in on what the deception was and why. We will not be using any deception in this class, nor have I ever, but it has been done, and it is possible to do under certain circumstances.

Have a look at this video clip and see what the elements of informed consent are here. Would what this individual does pass a review of ethics? What would, what wouldn't, and why not?
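As an aside on that de-identification step: below is a minimal sketch of what removing identifying information can look like, assuming interview notes kept as simple records with a name, a role, and a quote. This is not part of the lecture or of any IRB-mandated tooling; the field names, the `deidentify` helper, and the sample records are all invented for illustration.

```python
# Hypothetical sketch: replace each respondent's name with a role label
# and a study-assigned ID, so no quote can be attributed to an individual.

def deidentify(records):
    """Drop names; keep only a sequential participant ID and a role label."""
    cleaned = []
    for i, rec in enumerate(records, start=1):
        cleaned.append({
            "participant_id": f"P{i:03d}",  # e.g., P001, P002, ...
            "role": rec["role"],            # at most a role: "student", "faculty member"
            "quote": rec["quote"],          # the response itself, unchanged
        })
    return cleaned

# Invented example data; real notes would come from your own interviews.
raw_notes = [
    {"name": "Jane Doe", "role": "student", "quote": "Advising was hard to schedule."},
    {"name": "John Roe", "role": "faculty member", "quote": "Course loads keep growing."},
]

for row in deidentify(raw_notes):
    print(f"{row['participant_id']} ({row['role']}): {row['quote']}")
```

If a mapping from ID back to name is kept at all, it would live separately and securely and never appear in anything you report.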
So even once you're done interacting with your subjects, there are still a lot of ethical issues, and you really want to be honest about them. What are the shortcomings of your study? What are the limitations? That discussion is very often the last part of an article. You also want to report accidents, things that happened unexpectedly, even if they turned out to be good. Always be upfront about what you did, why you did it, and what the outcomes were.

At UVM we are governed by a body called the Institutional Review Board. This comes from federal law: if you receive any sort of federal grant, you must run your protocols through this process, and the same basic kinds of protocols are used. You will be doing an exercise on getting that approval. All researchers who do research with human subjects have to be certified by the IRB: you take a tutorial, pass it, and get a certificate. That will be part of this class as well, and you'll see more about it.

The IRB's role is to guarantee the rights and interests of the research subject and to minimize risks. Here at UVM, the same IRB oversees all the medical research too, where there is certainly a lot more risk, say with a new experimental drug or a heart valve, than in the kind of work we do, which is asking folks about their experiences, attitudes, awareness, beliefs, behaviors, and things like that. But we will have a consent process that we use for this class.

There are certain situations you may encounter as a professional researcher where, to put it one way, you learn things you really wish you hadn't: someone inadvertently tells you something that could get them into trouble, or that would seriously harm them if you shared it. Some examples: food safety, where people aren't following good standards; embezzlement; abuse;
things like trade secrets; animal cruelty; maybe unsafe working conditions. What will you do if you're there and you find that out? Do you look the other way? Do you turn them in? What would happen if you pulled a "gotcha" on your research subjects? What would that mean for you and your career as a researcher? I'm not saying there's a clear line, a clear right and wrong. These are things you will have to deal with, and I just wanted to raise them: there may be some gray area here that you have to navigate.

The last thing is: can social science be free of ideology? Is there work we can do that doesn't have an underpinning of politics or policy? Can we do pure research in this space? I would argue no: objectivity is a direction we aim in rather than a place we ever fully reach, and we all have our biases. I would also say that the type of research we do reflects our values. I work in sustainable community economic development and in food and agriculture, and I would argue that those areas always have social, political, and economic implications. Some people will agree with you and some won't, and you'll be favoring some groups over others. But think about what type of research you would feel comfortable doing, and why.

So this is the recap; this is what we discussed. And thank you!