Viewpoints & Discussion:
Research Assistants in Community Research: Overcoming Limitations of Community Relationships, Research Expertise, and Quality Assurance
Index Terms: research assistant; community worker; community research; school-based research
Suggested Citation: Vukotich, C. J., Jr., & Yearwood, G. M. H. (2014). Research assistants in community research: Overcoming limitations of community relationships, research expertise, and quality assurance. Journal of Research Practice, 10(1), Article V1. Retrieved from http://jrp.icaap.org/index.php/jrp/article/view/448/345
In its initial issue of 2014, the Journal of Research Practice featured an article entitled "Building Sustainable Research Engagements" (Vukotich, Cousins, & Stebbins, 2014). This article included one brief recommendation regarding the selection of research staff:
Universities readily embrace the spirit of diversity with programs to recognize, encourage, and create it. True diversity suggests that the institution reflect the population of the community in which it operates. Research teams should strive to hire staff that reflects the community in which they are working. This is also in keeping with community involvement in CBPR [community based participatory research]. This staff should have a connection to the host institution or school where the research is being done. (Subsection 4.6, “Select Staff Compatible With Host Institution”)
This paragraph was intended to cover an important but overlooked member of the research team: the research assistant. The research assistant is a key, yet often underappreciated, member of any research project, doing much of the practical day-to-day work of research (Hobson, Jones, & Deane, 2005; McGinn & Niemczyk, 2013).
The purpose of this viewpoint is to highlight the important role of the research assistant, provide a general review of the position, describe the authors' experience with a novel approach to hiring research assistants, and stimulate discussion so that all investigators better consider research assistants when designing research projects. This subject should be of general interest, especially to those interested in the original article (i.e., Vukotich et al., 2014).
There is not a great deal of literature on the research assistant. Universities sometimes have a job description, and a few more can be found on the Web, but they vary widely. Experience in public health research suggests that the research assistant, in all but purely laboratory settings, does some or all of the following:
(a) Conduct epidemiological surveillance
(b) Recruit research subjects or participants
(c) Collect specimens, such as nasal swabs and finger pricks, in the field or clinic
(d) Carry out diagnostic tests, such as a rapid flu test or diabetes test
(e) Transport specimens to the laboratory
(f) Collect data through surveys, interviews, or observations in person or by telephone
(g) Enter collected data into a computer
(h) Make educational presentations or interventions
These tasks can be mundane, but in some situations they require specific technical or analytical expertise (Hobson et al., 2005). The research assistant may take a substantial role in the research, and may be the face of the research to institutions, communities, and subjects in any project involving human subjects or participants. This is especially true where research is being done in the community.
There are several types of people whom you can hire as a research assistant. Some people make careers of being research assistants. Their status is often ambiguous: they hold temporary jobs that last as long as the project funding, and they survive by going from project to project. This practice exists in many universities (Hobson et al., 2005). These people are often poorly paid (starting salary is under $20,000 per annum at the University of Pittsburgh) and have limited career ladders. Experienced people can and do make more.
Research assistants also come from the ranks of graduate and undergraduate students. Students expect that research will provide a learning experience, which places a burden on faculty to provide one. Graduate salaries are generally fixed and usually include tuition and a stipend sufficient to support a basic living standard. Undergraduates are a bargain, often paid at work-study rates or used in the guise of undergraduate research (Silva et al., 2004). Graduate students' research is often integral to their education, so they are available for work on the same basis as a regular employee. Undergraduates have to go to class, and so are often unable to accommodate the vagaries of the research schedule.
In 2006, the first author began studying influenza in school children and has been involved in the study of flu in schools ever since. The Pittsburgh Influenza Prevention Project (PIPP) was the first cluster-randomized, case-control trial of the impact of non-pharmaceutical interventions for influenza in school children from kindergarten to fifth grade (K-5). PIPP took place in ten K-5 schools with a total of 3,800 students. The work measured the impact of the intervention on reducing absenteeism, illness, and influenza. The work continued with the Social Mixing and Respiratory Transmission Project (SMART), with over 4,000 students in twelve K-12 schools, to investigate how social mixing of students can spread disease and to create models of social contact in schools. SMART is the largest study of social mixing in school children ever conducted and the first to use multiple measurements of mixing in schools, using electronic tags (motes), contact diaries/surveys, and class schedules to create social contact models.
Research assistants were a critical part of both projects, and the question of whom to hire was crucial. The work would require several more people than the number of graduate students available, and undergraduates would not have enough time and flexibility.
These research projects faced the problem of building community trust in order to be successful. We were aware of the work on CBPR by Barbara Israel and others. Israel and her colleagues state our problem succinctly:
One of the major challenges in conducting CBPR is the understandable lack of trust that often exists between community members and researchers, based on the long history of research that has no direct benefit (and sometimes does actual harm) and offers no feedback of the results to the participants involved. (Israel, Schulz, Parker, & Becker, 2001, p. 185)
Using community workers is one way of addressing the problem (Israel et al., 2001). In this context, community workers are people from the community being studied who are not researchers by training or experience but who can be trained to do the tasks needed.
There are numerous examples of the use of community workers as research assistants, but their roles have often been limited to taking surveys or delivering simple health messages (Israel, Schulz, Parker, & Becker, 1998; Schulz et al., 1998; Simoni, Weinberg, & Nero, 1999; Wallerstein & Duran, 2006). Some reports, however, suggest a broader role for community workers, though not one that rises to the level expected of a research assistant. Kim, Flaskerud, Koniak-Griffin, and Dixon (2005) suggest involving them as "lay health advisors (LHA)" to address health disparities. Green and Mercer (2001) discuss the use of community workers as collaborators in the design and execution of the research. Borders, Grobman, Amsden, Collins, and Holl (2007) expanded the role of community workers to collect finger-stick blood samples. They all stress the importance of having community-based workers for data collection activities, especially when conducting home visits, as a way to improve participation and to develop trust between the parties involved.
In our own approach, we have drawn on these previous experiences but have gone further and tried to envision a broader role for community workers as research assistants. We hired 14 individuals from the community as full-fledged research assistants. We strove to hire people with one or more connections to the school community in which they were to work:
(a) They live or lived in the school district
(b) They attended the schools
(c) They have or had children in the district schools
Twelve of the 14 staff hired met one or more of our criteria. The two who did not were parents of school-age children and were volunteers in their children's schools. In addition, one person had worked in the school system to which they were assigned as a research assistant. Thirteen of the 14 had been parent volunteers. The remaining person had been a student in the school system, but had no children.
The reader can find the job descriptions used for research assistants in PIPP and SMART, plus the job description from the University of Pittsburgh at:
smart.pitt.edu/files/supplemental_material.pdf
There is no standard method for measuring the success of our staff choice. The use of community workers as research assistants was a practical response to a problem, and was not designed as an experiment, so no evaluative tool was incorporated into our work at the time it was being conducted. We were just trying to be successful investigators. After the fact, we asked the following questions:
(a) Did the work all get done in a timely manner?
(b) Was the project successful?
(c) Were the Principal Investigator and Co-Investigators satisfied with the outcome?
(d) Were the schools satisfied with the outcome?
(e) Were the staff satisfied with the outcome?
Both PIPP and SMART were simple projects conceptually, but immensely complex in execution. Both were large and involved multiple school buildings. Our research assistants did all the work in the time frame expected of them.
PIPP and SMART staff were required to collect data. They conducted interviews (in person and over the telephone), proctored surveys, and entered these and other data into spreadsheets and databases. Data collected were of high quality, and data entry was good. Staff received training, data collection and entry were reviewed, and quality checks were in place.
PIPP has ended, and one can conclude that it was successful. The project established a definite link between a non-pharmaceutical intervention and reduction in absenteeism (26%) and influenza A (52%) in school children. It is still the only study to link the intervention with reduction of disease (Stebbins et al., 2011). It also comprehensively measured the changes in preventive behaviors in students (Stebbins, Stark, & Vukotich, 2010). These results were significant enough to generate nine published research articles.
SMART is still wrapping up; however, it is the largest study of social networks and the spread of disease, and it uniquely used multiple technologies to study social networks. Four articles are nearing completion, with a few more on the drawing board. As authors involved in the two studies, we would rate the quality of the work in both projects as high. For example, data entry error rates were less than 5%, which is generally acceptable for data being entered in the field. Nasal swab samples taken by research assistants were clinically acceptable, yielding positive results for respiratory virus in 77% of cases; within the parameters of our project, this was a good result. Additional evidence on which our assessment relies consists of the following findings.
Both projects were considered successful by the funders, the Co-Principal Investigators, and the schools in which they were carried out. After the PIPP grant, the investigators were funded for SMART, and the same team is now starting a new 3-year grant to continue the work. For more information on PIPP and SMART, see http://www.smart.pitt.edu/
The school district involved in PIPP sent a letter, signed by the Superintendent, which extolled the project, and it provided support letters for other funding opportunities. The relationship between the school district and the research staff continues today, although no additional research has been done with this district. The relationship involves collegial consultation and continued support for education programs in the district (e.g., a science fair). Both parties look forward to future research together. For SMART, the two school districts involved in the research have supported the application for a follow-up grant and will continue to work with the investigators. Our school partners are happy.
Another measure of success, on multiple levels, is that PIPP translated the basic science into school-based hygiene practice. The school district involved with PIPP adopted the regimen that was tested in PIPP for all their schools on a permanent basis, including extensive use of hand sanitizer. The head of pupil services is convinced that the limited impact of the H1N1 influenza pandemic in their schools is a direct result of adopting hygiene practices from PIPP. As a result of SMART, two of the SMART schools are also adopting hygiene practices to prevent the spread of influenza.
Exit interviews were held with staff from both projects. They reported having a good experience. They were happy to be part of something important, and also to learn new skills. They were recognized for their work in their own community and were proud to be recognized, and to be able to spread good health messages to the community. Their responses frequently included words or phrases like these: “never dull,” “fun,” “an adventure,” and “a challenge.”
Trust is a big issue in communities, especially in minority communities. Our efforts seem to have created trust in the school communities. As a reflection of trust, PIPP had a 96% participation rate. SMART had a participation rate of 93.5% in the first year and 88.9% in the second year. All of these are good, particularly since the second year of a research project often sees a drop-off in participation. In the end, community-based research workers became ambassadors in their schools, churches, and communities for flu and its prevention, including the value of vaccination and of non-pharmaceutical interventions, especially hand sanitizer. One PIPP staff member noted, "When I was hired, I didn't think much about flu and flu shots, but now I have become a walking encyclopedia to all my friends, family and neighbors." This was a common refrain in exit interviews with staff members of both PIPP and SMART.
Our research assistants contributed beyond their formal job descriptions. They were able to provide insights into our research subjects because they were people who were similar to the subject population. They were able to help with ongoing issues of research design and execution. This outcome was also observed by Green and Mercer (2001), as noted above.
Using community workers can have disadvantages. These projects can offer good experience to people who may be trying to build or start a resume. However, they are typically part-time and/or temporary jobs by their nature, while people are often looking for full-time, permanent jobs.
Staff turnover can also be an issue. Losing staff members mid-project is challenging because the work must be done with fewer people until a replacement can be found. PIPP experienced turnover. One person had to be fired and one simply abandoned the job without notice. Two people moved to better jobs midway through the project. Other project staff, including the first author, were pressed into service until replacements could be found. The lesson learned from PIPP was applied in SMART. Staff members were asked to make a commitment (although unenforceable) to stay to the end, and candidates were scrutinized to see if they were likely to quit midway. None did. Simoni et al. (1999) also reported turnover problems, including people who simply quit without notice.
Some people required more intense training and supervision than others; these were not researchers who had been professionally trained elsewhere. However, both PIPP and SMART demonstrated that, with appropriate training and supervision, community-based research staff, whether professionally trained or not, can perform at the level of professional research staff.
The training burden when hiring community workers will obviously depend on people's backgrounds. In PIPP, the community workers mostly had some college education, and the training burden was greater than for the SMART staff members, who were all college graduates. Training in basic clinical skills and in the research subject area was needed for all staff. Other skills, such as communications and interviewing, depended on the experience of staff. Training on research integrity/ethics, informed consent, confidentiality, and professional conduct was also provided.
Simoni et al. (1999) reported problems with inaccuracy, incompleteness, and poor data coding in the conduct of survey interviews. The PIPP and SMART studies did not experience such problems. This lack of research error is significant, especially since the research assistants in both studies were not professionally trained or experienced researchers. Prioritizing training, quality assurance, and data management to reduce errors demonstrates the importance of training and quality control as good research practices; it also indicates that non-professional research assistants can and do acquire the skills and knowledge needed to perform important research tasks and responsibilities with a high level of efficiency and accuracy.
Community workers, properly trained, were able to take on the role of research assistants and perform at a high level. They contributed significantly to the research team and created valuable connections between the research team and the community. These insights emerged after the projects and were not part of the original research design. The following is a list of questions for further discussion:
(i) What are the most critical qualifications for hiring a community worker as a research assistant? Is education more important than personality and communications skills? Is recognized community leadership important? Is sales experience important in conducting research?
(ii) Given that in both the PIPP and SMART projects the research assistants had relatively high educational levels, what are the minimum educational standards required for the research work done in projects like PIPP and SMART?
(iii) Given that much of public health research is done in impoverished or low-income communities, how might projects create sustainable employment opportunities for community members connected with research projects?
(iv) How might principal investigators proactively allocate design time and funding for the deliberate hiring and training of community members, both as evidence of good research practice and as a demonstration of commitment to the communities?
(v) How can we develop rubrics for assessing the performance of research assistants as well as community responses to both the research project and staff?
The project described was supported by the National Institutes of Health through Grant Number UL1TR000005. Additional support was provided by Cooperative Agreement number 5UCI000435-02 and 5U01CK000179-02 from the US Centers for Disease Control and Prevention (CDC).
The authors wish to recognize our partner schools and their coordinators for this research: Pittsburgh Public Schools (Janet Yuhasz), Propel Charter Schools (Kristen Golomb), and Canon-McMillan School District (Grace Lani). We also recognize Co-PIs for PIPP (Donald Burke and Samuel Stebbins) and SMART (Shanta Zimmer and Derek Cummings). Thanks to David Galloway for all his help.
The authors have no conflict of interest to disclose.
Borders, A., Grobman, W., Amsden, L., Collins, E., & Holl, J. (2007). Factors that influence the acceptability of collecting in-home finger stick blood samples in an urban, low-income population. Journal of Health Care for the Poor and Underserved, 18, 100-115.
Green, L., & Mercer, S. (2001). Can public health researchers and agencies reconcile the push from funding bodies and the pull from communities? American Journal of Public Health, 91(12), 1926-1929.
Hobson, J., Jones, G., & Deane, E. (2005). The research assistant: Silenced partner in Australia’s knowledge production? Journal of Higher Education Policy and Management, 27(3), 357-366.
Israel, B. A., Schulz, A. J., Parker, E. A., & Becker, A. B. (1998). Review of community-based research: Assessing partnership approaches to improve public health. Annual Review of Public Health, 19, 173-202.
Israel, B. A., Schulz, A. J., Parker, E. A., & Becker, A. B. (2001). Community-based participatory research: Policy recommendations for promoting a partnership approach in health research. Education for Health, 14(2), 182-197.
Kim, S., Flaskerud, J. H., Koniak-Griffin, D., & Dixon, E. L. (2005). Using community-partnered participatory research to address health disparities in a Latino community. Journal of Professional Nursing, 21(4), 199-209.
McGinn, M. K., & Niemczyk, E. K. (2013). Research practice in research assistantships: Introducing the special issue on research assistantships. Journal of Research Practice, 9(2), Article E2. Retrieved from http://jrp.icaap.org/index.php/jrp/article/view/384/310
Schulz, A. J., Parker, E. A., Israel, B. A., Becker, A. B., Maciak, B. J., & Hollis, R. (1998). Conducting a participatory community-based survey for a community health intervention on Detroit’s east side. Journal of Public Health Management & Practice, 4(2), 10-24.
Silva, T. D. N., Aguiar, L. C., Leta, J., Santos, D. O., Cardoso, F. S., Cabral, L. M., . . . Castro, H. C. (2004). Role of the undergraduate student research assistant in the new millennium. Cell Biology Education, 3(4), 235-240.
Simoni, J. M., Weinberg, B. A., & Nero, D. K. (1999). Training community members to conduct survey interviews: Notes from a study of seropositive women. AIDS Education and Prevention, 11(1), 87-88.
Stebbins, S., Cummings, D. A. T., Stark, J. H., Vukotich, C. J., Jr., Mitruka, K., Thompson, W. W., . . . Burke, D. S. (2011). Reduction in the incidence of influenza A but not influenza B associated with use of hand sanitizer and cough hygiene in schools: A randomized controlled trial. The Pediatric Infectious Disease Journal, 30(11), 921-926.
Stebbins, S., Stark, J. H., & Vukotich, C. J., Jr. (2010). Compliance with a multilayered nonpharmaceutical intervention in an urban elementary school setting. Journal of Public Health Management and Practice, 16(4), 316-324.
Vukotich, C. J., Jr., Cousins, J., & Stebbins, S. (2014). Building sustainable research engagements: Lessons learned from research with schools. Journal of Research Practice, 10(1), Article M1. Retrieved from http://jrp.icaap.org/index.php/jrp/article/view/381/324
Wallerstein, N. B., & Duran, B. (2006). Using community-based participatory research to address health disparities. Health Promotion Practice, 7(3), 312-323.
Published 23 April 2014
Copyright © 2014 Journal of Research Practice and the authors