[Instructor] Hello, folks. Welcome to this week's lecture, which is on the ethics of social science. We'll be learning about some of the key issues around doing ethical research so that you are able to both recognize and, most importantly, conduct social science research within ethical boundaries and norms.

In this class, we are going to look at the ethics behind social science research. We'll look at the key factors we should consider to ensure that the work we are doing is ethical, learn a bit about some past controversies that have led to the need for these sorts of measures, and look a little at the IRB, the Institutional Review Board, which is the organization at UVM that helps ensure that all research is done ethically.

There are many reasons why it is important that the research we do is ethical. Here is a list of issues that we have to keep in mind when we conduct social science research, and I will go through each one.

A cornerstone of ethical research is voluntary participation.
That means you cannot force somebody to be part of a research study or project. You cannot coerce them. So if I were to say to you, "I am requiring you to be a research subject in a project that doesn't have anything to do with this class, and if you don't, you're going to get a failing grade," that would be coercion. I'm not allowed to do that. Of course, I'm just saying this as an example, and I would lose my job, I think, if I were to do that.

Part of voluntary participation as well is that your subjects know how the research will be used and any benefits that they or society at large may gain.

The next principle we should mention is no harm: we do not want to harm our research subjects. A lot of the motivation here comes from past experiments, such as the experiments the Nazis conducted on prisoners. Even though medical knowledge may have been advanced by them, they were done in a horrifically unethical way.
There's also the famous Tuskegee syphilis study, where African-American men were told that they were getting a treatment for syphilis, a sexually transmitted disease. In truth, these patients were not getting any effective treatment at all, and this was done so that researchers could study the development of the disease as it progressed, from when someone contracts it all the way through. Again, even though medical science may have been advanced by this, the ends don't justify the means, and both of these are rightly seen as horrible violations. I encourage you to learn more about them.

Now, the kind of work I do as a social scientist, and the kind of work we will do and learn how to do in class, is certainly not that extreme. However, there is still potential to do harm in social science research, and we especially want to make sure that we do not embarrass the subject, that we do not endanger their job, their marriage, their finances, their friendships, or even their reputation, nor do we want to harm them psychologically.
And if any deception is used, it is very important that we debrief the subjects afterwards. As an aside, I have never used deception in any of my work, nor will we use any deception in our classwork here.

The absolute cornerstone of ethical research is informed consent, where the subject has a really good idea of the benefits of the study: how they may personally benefit, how society may benefit, how the research will be used, as well as any possible risks they might face. Oftentimes we have the subject sign a statement, or give verbal consent, in which they give their express permission to participate. As we go through, I'm going to give you examples of these, which you will then use to gain the consent of the research subjects as we carry out our project.

Sometimes we are able to tell our subjects that their responses will be anonymous. Anonymous means neither the researcher nor the readers know who did or said what. With a mail survey, or an online survey that doesn't track IP addresses, we don't know who said what.
This can be a bit hard, because then we cannot track who has responded and who has not. But it's a good way to ensure that nobody's responses will come back to harm them in some way: if they say something on a survey that may be controversial, we don't know who said it, and therefore it can't be traced back to them.

A slightly lower bar is confidentiality. This would be the case more in an interview: if you're interviewing someone, the researcher will know who said what, but what you do is make sure that you do not attribute any statement to any individual. This requires training in ethics, which you will be doing for this class. It often involves removing identifying information, so that in your notes or transcripts from an interview, you don't have the individual's name or any way those statements could be traced back. Make sure that you know the difference between confidentiality and anonymity, and be able to give an example of each.

The next issue is deception. Sometimes this is used in a lab experiment to see how subjects may respond to some stimulus.
So for example, I might say in class (again, this is a purely hypothetical example, not anything I would actually do) that you have to stand up on your desk and recite a poem in order to earn homework points for that day, when I'm not actually going to give or deny any homework points; I just want to see if you will do it.

If you do use deception (again, we will not in this class; I have never used deception, nor can I imagine a time when I would), you really have to debrief your subjects and tell them something like: look, I really am not going to take away or give homework points here; I just wanted to see how you would react to my asking.

Here is a humorous example of some of the elements of informed consent that we may face. This is a clip from a Vermont-made film called "A Man with a Plan". It's about a Vermont dairy farmer, a man named Fred Tuttle, who runs for the House of Representatives.
In this scene, the henchman of the incumbent who's running against Fred is trying to get dirt on Fred, to find out bad things that will embarrass him and harm his candidacy. It's a really funny film, and I encourage you, if you ever get a chance, to see it. But if this henchman were a researcher, what are some of the ways he violates standards of ethical research? We'll discuss those in class.

Part of ethical research also has to do with analysis and reporting. An ethical researcher will be honest when reporting out what they did, including any shortcomings; they have to list the limitations. These are often placed at the very end of a research paper, being honest about what we found out and what we didn't, how far we can extrapolate, and things like that. Researchers should also report on accidents, even fortuitous ones. If you find something out completely by accident, it's good to be clear and honest that you were really not intending to learn this, but that through the course of the project you learned this new thing.
And based on what we know now, it is a very important result of the research.

All colleges and universities, especially those that receive any sort of federal funding, must have an Institutional Review Board. This is a board of faculty members and experts that reviews research protocols. So for example, in my work, I've done this for this class and for every other research project: I have to tell them in advance what I'm going to do, what the purpose is, what the benefits are, and what the risks are to my subjects. I have to show them the research questions, like the interview guide or the survey, tell them how I'm going to recruit subjects, and then have them approve it. You will actually be doing a tutorial and gaining a certification of your own, and you'll learn more about that at the end of this lecture.

The point of these institutional review boards is to guarantee the rights and interests of the subjects and to minimize risks.
Now, many economic and social science studies are so-called exempt, meaning the risks are seen as very minimal. So when we're asking students about their experiences with UVM dining, their attitudes about sustainability, or their experiences with remote and other learning modalities (to give some past examples), there really isn't great risk. These studies tend to be approved more quickly and with a bit more leeway. We will use a consent process for our class.

So far, what we've been talking about are pretty straightforward, well-established, almost no-brainer factors about ethical research. But there are many cases where you might end up in some sort of ethical quandary, specifically when you find out, probably inadvertently, that one of your subjects is doing something somewhere between unethical and illegal. Maybe they are not abiding by good food safety regulations. Maybe they are corrupt, or embezzling. You might learn trade secrets that other businesses would really like to know about.
You may witness examples of animal cruelty, or other things that you really think are wrong. So how would you handle these situations? I don't have a right or wrong answer in mind. What happens if you turn around and turn them in? What happens if you look the other way? Again, I'm not looking for a specific right or wrong answer, but as a social scientist, you may face these kinds of issues, and we'll discuss them at more length in class.

So this is what we talked about regarding ethical issues, and I look forward to talking with you more about it in class.