
Journal of Research Practice

Volume 9, Issue 2, Article M11, 2013


Main Article:
The Ethical Treatment of Research Assistants: Are We Forsaking Safety for Science?

Karen Z. Naufel
Department of Psychology
Georgia Southern University
Statesboro, GA 30460, UNITED STATES
knaufel@georgiasouthern.edu

Denise R. Beike
Department of Psychological Science
University of Arkansas, 216 Memorial Hall
Fayetteville, AR 72701, UNITED STATES
dbeike@uark.edu

Abstract

Science inevitably involves ethical discussions about how research should be implemented. However, such discussions often neglect the potential unethical treatment of a third party: the research assistant. Extensive anecdotal evidence suggests that research assistants can experience unique physical, psychological, and social risks when carrying out their typical responsibilities. Moreover, these research assistants, who may pursue research experience to bolster their curricula vitae, may feel coerced to continue to work in unsafe environments out of fear of losing rapport with the research supervisor or letters of recommendation for their future endeavors. In the present article, we address two important issues regarding the ethical treatment of research assistants. First, we present evidence suggesting that research assistants may experience substantive risk when implementing their assigned responsibilities. Second, we propose a document, the “Research Assistant’s Bill of Rights,” as a possible ethics code for people supervising research assistants. This document is independent of typical institutional review board processes, and it has the potential to maximize benefits to research supervisors and research assistants.

Index Terms: research training; research policy; research assistantships; research supervision; research ethics; experimenter rights; research assistant’s bill of rights

Suggested Citation: Naufel, K. Z., & Beike, D. R. (2013). The ethical treatment of research assistants: Are we forsaking safety for science? Journal of Research Practice, 9(2), Article M11. Retrieved from http://jrp.icaap.org/index.php/jrp/article/view/360/318



1. Introduction

Ethics is an integral component of research training. Across disciplines, research supervisors and research assistants (RAs) may discuss ethical issues such as scientific misconduct, authorship, and conflict of interest. Additionally, social scientists discuss their ethical responsibilities when collecting data from humans. These discussions center on the key principle guiding human research: that the potential benefits of research must outweigh the potential risks to the human participant (Belmont Report, 1979).

We continuously apply ethics when implementing research, but one of us realized the limitations of our ethical training when she collected data for her master’s thesis. Her research investigated how people anticipated and remembered unpleasant events, meaning that participants would be exposed to an unpleasant stimulus (in this case, watching a disgusting video of surgeries). The research also investigated how expectations regarding the timing of events affected people’s anticipation and memory of those events, meaning that RAs would have to deceive participants about when they would experience the unpleasant event. Participants were told they would watch the video at a later date, only to learn that they would be watching it that day. The ethical conundrums involving participants were thus twofold: participants would potentially experience the risks of distress (watching an unpleasant video) and deception (watching the video earlier than expected).

To minimize risks, she and her supervisor ensured that precautions were in place to protect participants from distress. Her Institutional Review Board (IRB), the board responsible for upholding human participant rights, approved the study. When it came time to proceed with the study, she felt confident that the participants’ rights were protected to the greatest extent possible. She then enlisted several RAs to help conduct the research associated with this master’s thesis.

Her ethical training, or any training for that matter, did not prepare her for the emotional toll that conducting this research would take on her and her RAs. The study involved over 100 participants, meaning that the RAs experienced every aspect of the study over 100 times. That is, they endured watching a disgusting video over 100 times, watching participants show disgust, faintness, and displeasure over 100 times, and telling a flimsy lie to participants over 100 times. Every time a participant entered the laboratory, she felt a sense of dread and embarrassment grow inside her, and her RAs reported that they often shared this dread. She had anticipated and handled the risks to the participants, but she had not anticipated the risks to the RAs conducting the sessions.

This experience with the in-lab project for her master’s thesis prompted us to examine the topic of ethics from an alternative perspective, that of the RA. We first turned to the literature within our own discipline, psychology, to examine whether RAs could experience harm when carrying out their tasks. However, we found that little relevant systematic research had been conducted within psychology. Instead, we found a mixture of theoretical evidence and personal narratives suggesting that psychology RAs may experience minimal to substantive risk when conducting research (see Bering, 2009; Oliansky, 1991).

Next, we examined literature from our discipline and other disciplines to determine what guidelines existed for handling potential risks to the RA. We found that RAs in other disciplines may also be susceptible to risks, but specific guidelines for how to reduce such risks were non-existent, non-specific, or unenforced. Additionally, many supervisors of RAs, at least within psychology, seem to be unaware that such risks exist.

We therefore discuss the risks that RAs face when engaging in psychological research, and we describe how RAs in other fields may experience similar risks. We also propose a document, the Research Assistant’s Bill of Rights, as a possible ethics code for those supervising RAs across disciplines. This document unifies the research and policy of several disciplines in order to maximize the benefits of research to both RAs and research supervisors.

2. Risks to the Research Assistant

Based upon our reading of the research literature in psychology and other disciplines, anecdotes reported in the literature and in person to the authors, and our own personal experiences, we noted several types of risk to which RAs in various disciplines are routinely exposed. To be clear, risk is not equivalent to harm. For example, being exposed to radiation presents an increased risk of cancer, but an individual person so exposed may or may not actually be harmed to the extent that he or she develops cancer.

We classified the types of risk that RAs could experience into three categories: physical risk (the potential for experiencing physical harm), psychological risk (the potential for experiencing emotional distress), and social risk (the potential for being viewed unfavorably by others). A classic study involving the norm of waiting in line illustrates how RAs may experience all three types of risk. At several locations around New York City, RAs intruded into existing lines, and observers measured how people responded (Milgram, Liberty, Toledo, & Wackenhut, 1986). The intruding RAs experienced disapproval from many of the unknowing participants. People yelled at, scowled at, and even physically ejected the RAs from line, behaviors indicating that other people disapproved of the RAs’ actions. Moreover, Milgram et al. reported that the RAs displayed aversive reactions to their assigned responsibility of interrupting a line, sometimes even resulting in physical ailments such as nausea. Thus, this study imposed several risks on the RAs: physical risk (being ejected from line), psychological risk (emotional angst as a result of the social disapproval), and social risk (being negatively evaluated by others).

RAs in various fields can experience myriad other physical, psychological, and social risks. Nonetheless, these risks are not always acknowledged in the training of RAs or in the ethical guidelines of many fields. We begin by documenting instances in which RAs have experienced, or could have experienced, one of these three types of risk.

2.1. Physical Risk

RAs may experience physical risk, or potential harm to the body, when conducting experiments. Several years ago, a psychologist recounted historical studies in which researchers put themselves in dangerous positions for the sake of science (Bering, 2009). Bering described a classic study (Harari, Harari, & White, 1985) in which researchers measured the extent to which passersby responded to a victim’s cry of rape. As Bering recounted the study (the researchers staged a fake rape scene in a park), he noted the potential peril in which the researchers had placed themselves. That is, Bering argued that a passerby could intervene violently, perhaps physically harming the researcher pretending to act as the rapist.

Bering (2009) merely described possible danger, but some authors have reported that RAs have been directly harmed as a result of research. For instance, Stanley Milgram (1974) recounted such an incident in his laboratory (Experiment 13a). In this experiment, participants were asked to administer seemingly painful shocks to a participant in another room. (In reality, no one in Milgram’s experiments was actually receiving the shocks.) When a participant showed reluctance to administer shocks, Milgram’s RA, disguised as an “ordinary man” (p. 90), encouraged the participant to proceed. If the participant refused, then this RA acted as if he were going to deliver the shocks. As Milgram described, one-quarter of the participants physically halted the RA from delivering shocks. Milgram even described a particular confrontation in which a participant “lifted the zealous shocker from his chair, threw him to the corner of the laboratory, and did not allow him to move until he had promised not to administer further shocks” (p. 97). As can be gleaned from this passage, Milgram’s RA experienced physical risk as a result of this study.

Research need not involve a direct measure of violence or defiance for an RA to experience physical risk. For example, a researcher examined whether drivers passed cyclists more closely when the cyclists wore helmets (Walker, 2007). Walker took his experiment to the streets, cycling on public roads and measuring the distance at which cars passed. He reported being struck by a vehicle twice while conducting this experiment. Walker did not indicate whether he had anticipated such an adverse consequence of conducting his research. In hindsight, however, it seems plausible that such a physical risk could arise given the research question.

We soon learned that psychology was not the only discipline exposing RAs to risks. For instance, the sociologist Lankshear (2000) discussed the physical hazards of implementing an ethnographic study in a hospital-like setting. She had planned to study “the daily work of the laboratory” (p. 74), merely observing the work of technicians and interviewing employees. However, implementing her research exposed her to physical risks, including potential exposure to pathogens that she had not considered until she was working within that environment. She also reported that she was unaware she needed Health and Safety training, which included recommendations for vaccinations, until she had already begun her research in the lab. In sum, her research endeavor put her at risk of microbial infection.

Walker (2007) and Lankshear (2000) were the primary researchers of their studies rather than RAs. Yet it is conceivable that RAs could experience the same risks as these researchers. A research supervisor, for instance, may ask RAs to ride a bicycle on the roads in order to reduce the extent to which his or her own biases influence data collection. In doing so, the RAs are put at risk of being hit by a vehicle. Similarly, a supervisor could send RAs to a hospital setting to collect data. As in the case of Lankshear, the supervisor may not have considered the potential risk of exposure to pathogens, or may assume that the RAs will receive adequate safety training when in fact they do not. In either case, RAs may unknowingly be exposed to health risks as a result of the supervisor’s assumptions.

RAs in other disciplines may also face physical risks when carrying out their responsibilities. Chemists, biologists, geologists, engineers, and medical researchers may handle hazardous compounds when conducting their studies, but the extent to which their RAs are aware of the risks of these compounds can vary. The American Chemical Society, for instance, emphasizes the importance of protecting all lab personnel, including RAs, from physical risks (American Chemical Society, 2012). However, national policies on safety only reduce risk to RAs if those policies are enforced. As an example, RA Sheharbano Sangji died as a result of injuries from a fire that occurred while she was conducting research in a chemistry lab (Kemsley, 2009). In her summary, Kemsley noted several safety violations in Sangji’s laboratory (e.g., Sangji had not received safety training; her supervisor had not fixed documented safety concerns in the lab). Kemsley also noted that Sangji’s qualifications for carrying out the procedure that killed her were questionable: Sangji’s supervisor asserted that she was qualified to perform the procedure safely, but her documented experience suggested that she may have needed more supervision. The tragic story of Sangji reinforces how RAs can experience risk, particularly if safety precautions and proper training are not adequately addressed.

2.2. Psychological Risk

In addition to physical risk, RAs may experience psychological risk, or potential emotional harm, when conducting a study. Typical experiments within the social sciences can pose any number of psychological risks to the RA, and our RAs have experienced such risks firsthand. We conducted experiments in which participants wrote about unpleasant experiences for 20 minutes over four days. Then we, along with our RAs, typed these essays. The task of typing seemed rather harmless; it was just simple data transcription. In fact, the first few narratives were intriguing, providing a glimpse into someone else’s thought processes.

Soon, distress overshadowed the initial intrigue. Participants wrote about tragic events, such as rape, deaths of loved ones, or medical scares, with emotional fervor, and we began to experience some version of our participants’ emotions. One of us remembers how the emotion of the events lingered after typing the narratives, with the unpleasant experiences sometimes pervading her dreams. In short, our RAs experienced distress as a result of data transcription.

The idea that repetitive emotional detail can evoke distress is supported in the literature. For instance, clinical psychologists may experience symptoms of distress when hearing their clients’ accounts of traumatic experiences, a phenomenon known as vicarious traumatization (McCann & Pearlman, 1990). The evidence is mixed, but some research suggests that copious exposure to traumatic information may intensify vicarious traumatization (Baird & Kracen, 2006). Additionally, copious exposure may trigger another form of distress, secondary traumatic stress, in which people become emotionally taxed from experiencing concern for the tragedies that befall others (Baird & Kracen, 2006).

One of our colleagues had a similar issue when conducting her dissertation research on the effects of pornography on relationships. Her RAs were asked to code the content of pornographic movies shown in the laboratory. After several weeks, the RAs began to report trauma symptoms such as unpleasant flashbacks and nightmares involving the pornographic material. Thus, the relatively common RA duty of coding data may trigger either vicarious traumatization or secondary traumatic stress, depending on the data being coded.

RAs in other disciplines, too, transcribe narratives and engage in activities that could evoke similar distress. For example, researchers in the medical fields may study how patients handle illness, and their RAs therefore listen to, code, and transcribe narratives about patients’ suffering. Similarly, historians may be interested in a particular aspect of war or terrorism, so their RAs may transcribe, review, or evaluate accounts of people’s traumatic experiences (e.g., newspaper accounts of genocide or diary entries describing coping with political turmoil). Moreover, RAs in any of these disciplines are unlikely to have just a single exposure to such unpleasant stimuli. Instead, they are likely to have repeated exposures, perhaps coding or witnessing hundreds of events. Repeated, prolonged exposure to such stimuli may increase distress in general (Baird & Kracen, 2006), implying that such exposure may raise the psychological risk to RAs as well. In fact, the Sexual Violence Research Initiative recognizes the potential problem of vicarious traumatization and recommends minimizing the risk to RAs by limiting them to no more than three interviews with perpetrators or victims of sexual violence per day (Jewkes, Dartnall, & Sikweyiya, 2012).

Other types of social science experiments can pose additional, possibly severe, psychological risks to RAs because of the use of deception. RAs may be instructed to deceive participants in order to preserve the validity of the study or to elicit a response, a common practice in some branches of psychology. For example, a study may be designed to test the effects of ostracism on aggression. In order to evoke feelings of ostracism, RAs might be asked to falsely inform participants that others did not choose them to be in their group. The RAs’ responsibilities thus involve lying, which has been associated with several negative outcomes, such as discomfort (DePaulo, 2004) and poor health (Kelly, 2012, as cited by American Psychological Association, 2012). In fact, Kelly found that people instructed not to lie over the course of 10 weeks were significantly healthier than people who lied at their normal rate, suggesting that the lies required as part of a deception study may have negative consequences.

Personal narratives from RAs corroborate the idea that lying to participants can evoke psychological risk. Reflecting on his experience as an RA, Oliansky (1991) described the stress of repeatedly lying to participants in the course of an experiment. As Oliansky described, the deception story was unbelievable in some circumstances, augmenting the already uncomfortable situation for both him and the participant. In fact, some participants would banter with him about the true nature of the story for several minutes. In the end, he reported feeling guilt, shame, and a distaste for research.

2.3. Social Risk

In addition to physical and psychological risks, RAs’ responsibilities may involve social risk, or the risk of damaging relationships with others or one’s social status. RAs are particularly likely to experience social risk if their responsibilities involve engaging in socially unacceptable behavior. When we have presented talks about how RAs can experience social risk, several people have volunteered their own experiences for inclusion in this article. One RA disclosed to us a situation in which his responsibilities involved playing a violent video game in front of other participants. As part of the video game, he hired a (virtual) stripper to give him a lap dance. He reported feeling particularly embarrassed about implementing this task, especially because some of the participants were women in his other classes. Moreover, participants were not told that he was part of the experiment until later that semester, and he reported feeling that participants may have viewed him as a “pervert” for the whole term.

Another RA (who was also a graduate student) also shared his story of helping create materials for a study. He acted in a video manipulation in which he was depicted as either a gay or a heterosexual instructor, and participants evaluated his efficacy as an instructor. Although he did not mind being in the video initially, he later felt awkward when he became an instructor and the video was still being used for research purposes involving students with whom he regularly interacted. Oliansky (1991), too, discussed the social awkwardness of running into participants that he or his RAs had deceived.

Anecdotes of RAs experiencing risk are not rampant, but there is enough evidence to suggest that physical, psychological, and social risks exist for RAs. It is therefore important for research supervisors to consider potential risks to RAs and to work to minimize them.

3. Minimizing Potential Harm to Research Assistants

We sought out resources that would potentially address how to approach the issue of risks to RAs. Like human participants, RAs may experience physical, psychological, and social risk. Because RAs seem to experience risks similar to human participants, we first turned to discipline-specific standards for human participant research and ethics for guidance (e.g., American Psychological Association, 2002; British Sociological Association, 2002) and cross-disciplinary guidelines, such as the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, & Social Sciences and Humanities Research Council of Canada [CIHR, NSERC, & SSHRC], 2010). The American Psychological Association guidelines did not contain phrasing acknowledging researcher risks, let alone risks specifically for the RA. The Tri-Council Policy Statement does mention the possibility of harms to RAs, to wit:

Risks in research are not limited to participants. In their conduct of research, researchers themselves may be exposed to risks that may take many forms (e.g., injury, incarceration). Risks to researchers may become a safety concern, especially for student researchers who are at a learning stage regarding the conduct of research, and who may be subject to pressures from supervisors to conduct research in unsafe situations. (p. 25)

The British Sociological Association (2002) has a similar statement, noting that, “Social researchers face a range of potential risks to their safety. Safety issues need to be considered in the design and conduct of social research projects and procedures should be adopted to reduce the risk to researchers” (p. 2).

Additionally, we examined the extent to which IRBs within the United States evaluated risks to the RA. We randomly sampled 50 universities within the United States that engage in “high research activity” (according to the Carnegie Research Classification system). Two RAs then located each university’s IRB website and coded whether that IRB required research supervisors to address the potential risks of research to RAs. Of the IRBs sampled, 46 had consistently working websites to code. According to both coders, no IRB website or form required research supervisors to address potential risk to RAs (100% inter-rater agreement). However, this null result was expected. Guidelines regarding human participant research state that it is the responsibility of the IRB to weigh the risks to the participant against the benefits to society (Department of Health and Human Services, 2009). As others have argued, IRBs may not have the infrastructure or resources to consider potential risks to groups beyond participants (Hausman, 2007). As Hausman notes, IRBs are already taxed with the responsibility of protecting human participants, and the added responsibility of protecting another group may be impractical. Canada’s Tri-Council Policy Statement (CIHR, NSERC, & SSHRC, 2010) does suggest a role for IRBs (research ethics boards, or REBs, in their terminology):

While it is not a formal part of its responsibilities, an REB may raise concerns about the safety of student researchers as part of its communication to the student researchers, and to their supervisors. Based on the level of risk, the REB may consider referring these concerns for review by an appropriate body within the institution. (p. 25)

But the definition of an “appropriate body” is not provided.

Based upon this analysis, we propose that the responsibility for discussing the risks and benefits of research falls upon research supervisors and their RAs. That is, supervisors should be aware that potential risks exist, and RAs should feel that they can approach their supervisors with concerns about such risks. Unfortunately, RAs may not feel empowered or able to approach their supervisors with concerns. RAs are well aware that research experience, positive letters of recommendation, and high grades are all important criteria for securing positions in graduate school (Sanders, 2012). Similarly, RAs may recognize that positive job performance has implications for obtaining employment or benefits after graduation. Therefore, RAs may fear that reporting risks would reflect poorly upon them, perhaps resulting in low grades or the withholding of positive work evaluations.

Alternatively, RAs and supervisors may be unaware of these risks, meaning that they are also unaware of the potential need to address them. When the RAs in our psychology labs learn about the risks they may experience, most respond that they have never thought about such risks. Recently, one of our RAs even said, “How would we know? We are new at this.” She reported that she deferred to our judgment. Anecdotally, training RAs to code IRB websites for this article was difficult: the consistent absence of any mention of concern for RA safety made the RAs worry that they were completing the task incorrectly. One RA also reported that it was “scary” that no one seemed to care about her protection.

Unfortunately, research supervisors may share RAs’ naiveté about risks. When we have presented information about RA risks at conferences, for instance, members of the audience often sit wide-eyed and surprised. Evaluation forms from one presentation of this topic at a conference (Naufel, Beike, Heath, & Strickland, 2011) confirm that research supervisors are largely unaware that such risks exist; conference participants wrote comments like “Never really considered this issue before!” and “Interesting topic I hadn’t considered.” In other words, research supervisors appear to be unaware of the risks rather than intentionally callous.

In summary, there are, to our knowledge, no current guidelines delineating how to approach the topic of RA risks that can be applied across disciplines and scenarios. Additionally, it does not seem to be the responsibility of existing institutional structures (e.g., IRBs) to take on this challenge, especially if they lack the resources to do so. Instead, we suggest that the responsibility falls primarily upon research supervisors and the RAs themselves, and we offer guidance below for how to address RA risks. If IRBs raise questions about the ethical treatment of RAs during their normal process of review, we propose that those concerns be expressed to the research supervisor’s superior (e.g., the department chair or manager). If the concerns are not handled properly at that level, the report would go to the official or board that handles research integrity for the institution (i.e., where complaints about research misconduct on the part of academic staff or students would be reported).

4. Research Assistant’s Bill of Rights

We propose that research supervisors take the rights of RAs seriously. In doing so, research supervisors can reduce risks to RAs. To be clear, we are not suggesting that assistants’ rights have heretofore been or are currently being routinely violated. To the contrary, our discussions with colleagues suggest a number of ethical practices (delineated below) regarding RAs are already common, and the types of harm we listed earlier are the exception rather than the rule. Our goal is to codify these “best practices” and to suggest a few additional protections to ensure that all RAs can work in the safest, most rewarding environment. Research supervisors ought to minimize risks and harms and maximize benefits to everyone engaged in the research enterprise.

In order to have a framework for deriving the Research Assistant’s Bill of Rights, we primarily consulted the ethical principles and code of conduct of the American Psychological Association (2002), highlighting portions that were originally written to apply to research participants or clients but that could also apply to RAs. However, RAs are not interchangeable with research participants; the two differ in their knowledge of a study’s procedures, interest in the topic being studied, nature of benefits gained, and paraprofessional status. We therefore modified the guidelines to be more appropriate to the role of RAs. We also examined the Tri-Council Policy Statement on research ethics (CIHR, NSERC, & SSHRC, 2010), the American Educational Research Association’s (AERA) Ethical Standards (1992), and the Nuremberg (or Nuernberg) Code (“Trials,” 1946-1949), as all of these policies contain applicable information regarding the protection of RAs from harm. From a consideration of these sources, we developed 10 principles (see Table 1).

Table 1. Research Assistant’s Bill of Rights

Article 1. Right to safety
Summary: Research assistants’ risk of harm from participants, environments, and apparatus should be minimized.

Article 2. Right to informed consent
Summary: Research assistants have the right to informed consent with regard to their involvement in the research. The informed consent should document the potential risks and benefits associated with the study, unless withholding the purpose of the study is essential for the study’s validity.

Article 3. Right to refuse to participate in data collection activities that one finds objectionable
Summary: Research assistants can refuse to conduct research or engage in other activities that they find objectionable. If a research assistant does find such an activity objectionable, an alternative assignment should be offered.

Article 4. Right to withdraw
Summary: Research assistants have the right to withdraw during a session or study that they have begun, without penalty.

Article 5. Right to counseling and notification of incident if harm occurs
Summary: Research assistants have the right to have an incident reported to the appropriate body (e.g., the research supervisor’s superior or a body that typically handles research integrity) if they experience harm. Research assistants also have the right to counseling.

Article 6. Right to proper training
Summary: Research assistants should be trained before being entrusted with research duties.

Article 7. Right to feedback
Summary: Research assistants should receive feedback about their performance. Students and supervisees should be evaluated on the basis of their actual performance on relevant and established program requirements.

Article 8. Right to debriefing
Summary: Research assistants have the right to know the outcome of studies to which they contributed.

Article 9. Right to receive benefits for the work performed
Summary: Research assistants should receive benefits for the amount of work that they have completed, but they do not have to receive benefits for work from which they have withdrawn.

Article 10. Right to choose confidentiality in public acknowledgements
Summary: Research assistants have the right to keep their name unassociated with their role in the experiment in published reports or presentations of the research.

4.1. Article 1: Right to Safety (see APA Standard 3.04)

RAs have the right to protection from harm while completing their assigned tasks. Possible sources of harm include but are not limited to: (a) physical harm from faulty equipment (e.g., receiving electric shocks from exposed wiring) or a risky environment (e.g., collecting data in a dangerously noisy location without adequate hearing protection), (b) physical harm or threat from distressed or angry participants (e.g., being shoved by a participant after delivering a scripted insult), (c) emotional harm from repeated engagement in distressing tasks (e.g., reading hundreds of participants’ descriptions of traumatic experiences, repeatedly lying during a deception study), and (d) sexual or other harassment from lab mates or the research supervisor (see APA Standards 3.02 and 3.03). Research supervisors should consider and make every attempt to minimize any such potential harm to the RA without increasing the risk to the participants.

For example, research supervisors may be able to change the timing of data collection to a less noisy time of day, or limit the number of traumatic narratives one RA must read. Note that this Article, as well as many of the others, requires some flexibility and forethought on the part of the research supervisor. Not every task can be completed in a safer environment or made less stressful, but research supervisors must begin to think about such alternatives as they design their research. Research supervisors should also train all lab members in the institution’s sexual and other harassment policies, and make clear that these policies will be strictly enforced in the lab.

4.2. Article 2: Right to Informed Consent (see APA Standards 8.02 and 8.07)

To encourage research supervisors to have forethought about a study’s risks to the RA, research supervisors should develop and administer an RA informed consent form prior to commencing a study. This informed consent document should contain information regarding a study’s risks, benefits, and purpose, unless withholding of information from the RA is necessary for a study’s validity. RAs should read and sign the informed consent form prior to engaging in their responsibilities.

We acknowledge, however, that it is often necessary to withhold information from assistants with regard to the precise predictions of a study due to the potential problems of experimenter biases. In such cases, the true purpose of the study should be revealed in a timely way (see also Article 8).

4.3. Article 3: Right to Refuse to Participate in Data Collection Activities That One Finds Objectionable (see APA Standard 8.04.b)

The informed consent process provides RAs with the opportunity to learn about the nature of their work. If the RAs learn that their assigned responsibilities will offend them or distress them unduly, then RAs should not be coerced into working on tasks. Hence, RAs should have the right to refuse to participate in data collection activities they find objectionable or unsafe.

Although Article 1 ensures that the research supervisor has considered what may be objectionable or distressing for most people, Article 3 takes into account that RAs may have different levels of comfort. Some very bright and hard-working RAs simply are not comfortable acting as confederates in research; they are too shy, feel they are not good actors, or are conscientious objectors to the idea of lying. Rather than firing the RA for refusing to work on a particular study (e.g., refusing even to “try” it), which could be seen as punitive, we again strongly encourage that RAs be offered an alternative task. Though assistants do not have infinite choices in the tasks to which they are assigned, there are almost always some alternatives available. We reiterate that research supervisors must begin to consider and build in alternatives for RAs as they design and plan their research activities (see Article 4 for further discussion). Research supervisors should also consider appropriate compensation and benefits for alternative assignments.

4.4. Article 4: Right to Withdraw (see APA Standard 8.02.a.2)

Very occasionally, a study will involve unexpected stressful or harmful outcomes, or an RA may find his or her tasks more taxing than he or she had anticipated. For instance, the RA may feel uncomfortable conducting a particular session at a particular time or with a particular participant. On late Friday afternoons in an empty classroom building, a hostile or intoxicated participant presents risks to the RA that are not offset by the benefit of adding one more data point to the study. In such cases, the RA should feel able to halt the session without reprimand. However, RAs do not have the right to neglect their duties. For example, withdrawing is not simply failing to show up for sessions without informing the research supervisor. Withdrawal consists of an official statement or action by the RA that he or she will no longer conduct this session or work on this particular study. The research supervisor ought not penalize the RA for this action; however, he or she may cease further payment or additional benefits if the RA refuses to be transferred to work on a different study or task. Might such an open-ended withdrawal clause lead to widespread abandonment of sessions by RAs? Anecdotally, we have notified the RAs in our labs of their right to withdraw for several semesters now, and we have never had an RA withdraw without warrant.

The primary difference between Article 3 and Article 4 hinges on the issue of the onset of participation. Article 3 indicates that an RA may refuse responsibilities altogether; Article 4 indicates that an RA who has already accepted responsibilities associated with the study may withdraw later. Therefore, the procedures for administering benefits also differ slightly. The procedures for both Articles emphasize that an incentive should in no way be used coercively to make an RA work on a task that he or she finds morally or otherwise objectionable. That is, RAs should not feel that refusing or halting work because of concerns for safety will negatively affect their career.

We offer a procedure for reducing the likelihood of coercion that parallels the policies governing human participants in research. The APA’s ethical principles note that “When research participation is a course requirement or an opportunity for extra credit, the prospective participant is given the choice of equitable alternative activities” (p. 12). Canada’s Tri-Council Policy Statement on research ethics (CIHR, NSERC, & SSHRC, 2010) asserts:

The participant should not suffer any disadvantage or reprisal for withdrawing nor should any payment due prior to the point of withdrawal be withheld. If the research project used a lump-sum incentive for participation, the participant is entitled to the entire amount. If a payment schedule is used, participants shall be paid proportionate to their participation. (p. 30)

These principles map onto the scenarios that RAs may face, and therefore the same rules should apply to RAs who receive incentives for their services. Specifically, an RA who refuses to take part at all in conducting a particular study can receive an alternative assignment (e.g., coding, scoring, or entering data, or scheduling participants). But an assistant who has begun conducting sessions and learns after further involvement that she is unduly distressed by her duties may withdraw without penalty (e.g., she may not be told she has to “make up” any sessions she cancelled). Instead, she should receive compensation proportionate to the duties completed and be offered an alternative assignment (see also Article 9). These procedures ought to apply to all RAs, regardless of whether they receive course credit or payment for their work.

We recognize that being a paid RA is a job, and that people often are required to perform duties of their jobs that they dislike or find objectionable. We have two responses to this argument. First, the rights asserted by Articles 3 and 4 concern activities that the RA finds morally objectionable or highly unpleasant to the point of physical or emotional pain; this is a more stringent criterion than the right of research participants to withdraw for any reason. Many employers do allow employees to opt out of certain tasks that they find morally objectionable, as in the case of the U.S. military allowing people with conscientious objections to be assigned to noncombatant military duties (Department of Defense, 2007). Additionally, one tenet of Canada’s employment policies is that people have the right to refuse unsafe work (Canadian Centre for Occupational Health and Safety, 2008). In other workplaces, job descriptions are fluid and can be adjusted to suit the abilities and preferences of the people currently holding those jobs: a task that originally fell under Employee A’s duties may instead be assigned to Employee B if he or she is more skilled at, or takes more enjoyment in, that task.

Second, we feel that those of us involved in the research enterprise ought to hold ourselves to higher standards than those in other fields of pursuit. All of the major guidelines for dealing with human participants stress the right of humans to be self-determining, including the Nuremberg Code (“Trials,” 1946-1949), the Belmont principles (Belmont Report, 1979), the American Psychological Association (2002), and others. RAs, too, are humans engaged in research, and therefore this human right extends to them. To assert that RAs give up their rights once they accept a paycheck is a dangerous line of reasoning as it violates this basic human right.

4.5. Article 5: Right to Counseling and Notification to the Appropriate Body if Harm Occurs (see APA Standard 8.08.c)

RAs who do experience harm or unforeseen distress should be restored to their former state as soon as possible. Moreover, the incident of harm must be catalogued. We recognize that institutions may not have a mechanism in place for cataloging such complaints, so we tentatively suggest a department chair or manager, the institution’s office that handles research integrity, or the institution’s IRB or equivalent as potential bodies to whom harms should be reported. This body may then consider whether a review of the study’s procedures is warranted.

4.6. Article 6: Right to Proper Training (see AERA Guiding Standard VI; also Nuremberg Code Principle 8)

RAs have the right to receive training in the skills necessary to competently execute the tasks assigned to them. We suggest that skills rather than length of time spent in training be considered the benchmark of “proper training.” RAs who quickly master data entry or lengthy protocols would therefore require fewer hours of training than those who do not master these skills as quickly. Importantly, this right relates to Article 3, as it is in the course of training that RAs are likely to discover that they find certain activities objectionable. In addition, the pursuit of proper training may allow identification of tasks that fall outside the talents, abilities, interests, or comfort level of a particular RA. For example, an RA may be enthusiastic about the idea of conducting a study involving deception, but in the process of training to conduct the protocol he or she and the supervisor may discover that acting skills are simply not in his or her repertoire. An alternative assignment at which the RA demonstrates greater skill (e.g., coding of verbal responses) may then be provided.

4.7. Article 7: Right to Feedback (see APA Standard 7.06; AERA Guiding Standard VI.b.5)

RAs are research workers rather than mere participants in research, and they may be seeking out the experience to enhance their education or training. As such, they are entitled to know the quality of their performance and the degree to which they are making satisfactory progress in their research skills. RAs should be evaluated on the basis of their actual performance on relevant and established program and job requirements.

4.8. Article 8: Right to Debriefing (see APA Standard 8.08)

RAs have the right to know the outcomes and findings revealed by the studies to which they contribute. Too often the end of semester or academic year fails to coincide with the end of data collection or analysis, perhaps leaving assistants unaware of the final results obtained in a study. Research supervisors have the responsibility to stay in contact with former RAs and inform them of the results of the research when it has reached its conclusion.

4.9. Article 9: Right to Receive Benefits for Work Performed (see APA Standard 8.04.a)

If an RA withdraws from further work on a study before that study or semester is completed, the assistant must receive benefits or compensation in proportion to the work he or she has completed. That is, unlike participants, assistants may be denied full benefits if they withdraw (see also Articles 3 and 4).

4.10. Article 10: Right to Choose Confidentiality in Public Acknowledgements (see APA Standards 4.01, 4.05, and 4.07)

RAs have the right to choose the protection of their identity as regards their involvement in the research. Their identity should not be revealed in public reports, presentations, or discussions of the results of studies upon which they worked, without their prior consent. Perhaps this Article of the Bill of Rights will be the most controversial. It is common to acknowledge the hard (often voluntary) work of RAs in a published paper or on a slide in a research presentation. Although we strenuously support the right of RAs to receive all benefits due them (see, e.g., Article 9), we suggest that revealing their identity may have unintended costs. Using one of our earlier examples of potential harm that may have accrued to RAs, imagine a student who has agreed to view sexually explicit material in front of other participants. Possibly, this student has plans for graduate school and beyond, but may not wish to be known as “the guy who agreed to act like a pervert” to those who closely read the psychological literature. Quite simply, RAs may agree to play roles or take on tasks for the good of the research enterprise, the specifics of which they may not wish to be linked to their names. We encourage research supervisors to discuss the benefits and consequences of public acknowledgment of research contributions with their RAs before any reports are published or presented. Of course there may be situations other than published reports that require the names of RAs to be recorded, such as employment records, grant applications, and the like. Although the research supervisor cannot refuse to provide the names of RAs, these documents have a much more limited audience and will therefore entail less potential social risk to an RA.

5. Summary

We propose this Research Assistant’s Bill of Rights not as a completed work, but rather as a starting point for discussion of a topic important for the continued ethical progress of our field. We do not wish to tie research supervisors’ hands, to create a culture of fear or excessive caution, or to create a new system of red tape and paperwork—we too are research supervisors who want to continue our own work in as unobstructed a manner as possible. Therefore, we envision the Bill of Rights as an addition to the APA and other research organizations’ codes of ethics, either formally or informally.

There are a number of ways this preliminary Bill could be put into place. The easiest method is simply for research supervisors in any discipline to list these rights and how they will be implemented in a syllabus given to RAs at the beginning of their tenure as part of the research team. Enrollment in the course or acceptance of employment is therefore evidence of implied consent to involvement in the research and documentation of the rights listed previously. In addition, or alternatively, individual departments could publicize the Bill of Rights on their websites, informing any people who work as RAs that the department affords them the rights so indicated. We encourage research supervisors, labs, and departments across disciplines and nations to work with the Research Assistant’s Bill of Rights and to report to us their experiences with it. In our continued work on this issue, we will revise and modify the Bill to maximize benefits and minimize costs to research supervisors and RAs.

Acknowledgements

We would like to thank Kent Bodily, Alicia Carter, and Aria Gabol for their assistance with this article. We would also like to thank the research assistants and faculty members who volunteered their anecdotes generously.

References

American Chemical Society. (2012). Creating safety cultures in academic institutions: A report of the Safety Culture Task Force of the ACS Committee on Chemical Safety. Retrieved from http://www.acs.org/content/dam/acsorg/about/governance/committees/chemicalsafety/academic-safety-culture-report-final-v2.pdf

American Educational Research Association. (1992). Ethical standards of the American Educational Research Association. Educational Researcher, 21, 23-26.

American Psychological Association. (2002). Ethical principles of psychologists and code of conduct. Retrieved from http://www.apa.org/ethics/code2002.html

American Psychological Association. (2012, August 4). Lying less linked to better health, new research finds. Retrieved from http://www.apa.org/news/press/releases/2012/08/lying-less.aspx

Baird, K. K., & Kracen, A. C. (2006). Vicarious traumatization and secondary traumatic stress: A research synthesis. Counselling Psychology Quarterly, 19, 181-188.

Belmont Report. (1979). The Belmont report: Ethical principles and guidelines for the protection of human subjects of research. Retrieved from http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html

Bering, J. (2009, January 13). Brave, stupid, and curious: Dangerous psychology experiments from the past. Scientific American. Retrieved from http://www.scientificamerican.com/article.cfm?id=brave-stupid-and-curious

British Sociological Association. (2002). Statement of ethical practice. Retrieved from http://www.britsoc.co.uk/media/27107/StatementofEthicalPractice.pdf

Canadian Centre for Occupational Health and Safety. (2008). OH&S legislation in Canada: Basic responsibilities. Retrieved from http://www.ccohs.ca/oshanswers/legisl/responsi.html

Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, & Social Sciences and Humanities Research Council of Canada. (2010). Tri-council policy statement: Ethical conduct for research involving humans. Retrieved from http://www.ethics.gc.ca/pdf/eng/tcps2/TCPS_2_FINAL_Web.pdf

Department of Defense. (2007). Department of Defense instruction (DoDI 1300.06). Retrieved from http://www.dtic.mil/whs/directives/corres/pdf/130006p.pdf

Department of Health and Human Services. (2009). Code of federal regulations. Retrieved from http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html

DePaulo, B. (2004). The many faces of lies. In A. G. Miller (Ed.), The social psychology of good and evil (pp. 303-326). New York: Guilford.

Harari, H., Harari, O., & White, R. V. (1985). The reaction to rape by American male bystanders. The Journal of Social Psychology, 125, 653-658.

Hausman, D. M. (2007). Third party risks in research: Should IRBs address them? Ethics and Human Research, 29, 1-5.

Jewkes, R., Dartnall, E., & Sikweyiya, Y. (2012). Ethical and safety recommendations for research on perpetuation of sexual violence. Sexual Violence Research Initiative, Medical Research Council, Pretoria, South Africa. Retrieved from http://www.svri.org/EthicalRecommendations.pdf

Kemsley, J. N. (2009, August 3). Learning from UCLA. Chemical & Engineering News, 87(31), 33-34. Retrieved from http://cen.acs.org/articles/87/i31/Learning-UCLA.html

Lankshear, G. (2000). Bacteria and babies: A personal reflection on research risk in a hospital. In G. Lee-Treweek & S. Linkogle (Eds.), Danger in the field: Risks and ethics in social research (pp. 71-90). New York: Routledge.

McCann, I. L., & Pearlman, L. A. (1990). Vicarious traumatization: A framework for understanding the psychological effects of working with victims. Journal of Traumatic Stress, 3, 131-149.

Milgram, S. (1974). Obedience to authority. New York: Harper & Row.

Milgram, S., Liberty, H. J., Toledo, R., & Wackenhut, K. (1986). Response to intrusion into waiting lines. Journal of Personality and Social Psychology, 51, 683-689.

Naufel, K. Z., Beike, D. R., Heath, W. H., & Strickland, D. S. (2011, February). Considering the risks to the undergraduate research assistant. Paper presented at the Annual Meeting of the Society for the Teaching of Psychology at the Society for Personality and Social Psychology Annual Conference, San Antonio, TX.

Oliansky, A. (1991). A confederate’s perspective on deception. Ethics & Behavior, 1, 253-258.

Sanders, C. E. (2012). The graduate school application process: What our students report they know. Teaching of Psychology, 39, 128-132.

Trials of War Criminals Before the Nuernberg Military Tribunals (Vol. 2). (1946-1949). Washington, DC: U.S. Government Printing Office. Retrieved from http://www.loc.gov/rr/frd/Military_Law/NTs_war-criminals.html

Walker, I. (2007). Drivers overtaking bicyclists: Objective data on the effects of riding position, helmet use, vehicle type and apparent gender. Accident Analysis and Prevention, 39, 417-425.

 


Received 15 February 2013 | Accepted 2 August 2013 | Published 1 November 2013