The Journal of Extension - www.joe.org

June 2013 // Volume 51 // Number 3 // Feature // 3FEA2

Research Use by Cooperative Extension Educators in New York State

Abstract
A Web-based survey of 388 off-campus Cornell Extension educators in New York State examined their attitudes toward research, sources of research-based information, knowledge and beliefs about evidence-based programs, and involvement in research activities. Strong consensus emerged that research is central and that educators are capable of reading and applying it. The Web is their most frequent source. Time is the greatest barrier. Educators know about evidence-based programs but do not necessarily regard them as superior. Research experience is common among educators in agriculture, much less so in 4-H. New methods are needed to connect educators with faculty.


Stephen F. Hamilton
Bronfenbrenner Center for Translational Research
Department of Human Development
Cornell University
Ithaca, New York
sfh3@cornell.edu

Emily K. Chen
Department of Human Development
Cornell University
Ithaca, New York
ek455@cornell.edu

Karl Pillemer
Bronfenbrenner Center for Translational Research
Department of Human Development
Cornell University
Ithaca, New York
kap6@cornell.edu

Rhoda H. Meador
Center for Health Sciences
Ithaca College
Ithaca, New York
rmeador@ithaca.edu

Introduction

The Extension system exists to disseminate the findings of research beyond the academic community to practitioners, policy makers, and the general public. Extension educators thus serve as a bridge between scholars and the wider community. For example, scientists may find a way to apply pesticides more precisely or discover the benefits of serving low-fat milk to children. Extension staff then educate farmers or parents, respectively, about the new findings. These examples illustrate what Nutley, Walter, and Davies (2007) called the "knowledge-driven model" of research utilization in policy and practice.

Dating from the beginning of the Extension system, educators have been considered experts in taking the latest university-generated research and making it available to interested publics in the form of science-based information and programs. Expertise in what has come to be called research translation (Wethington & Dunifon, 2012) is thus a hallmark and necessary core competency of the Extension educator's work (Harder, Place, & Scheer, 2010). In many educational and service fields (from nursing, to psychotherapy, to law enforcement), investigations have taken place regarding how practitioners use research in their work (Hemsley-Brown, 2004). However, no systematic investigations have addressed how Extension county educators access research findings, interact with research faculty, or encounter barriers to the use of research findings in their work. The lack of research on this topic is surprising; as many states' Extension systems face declining numbers of Extension faculty appointments, county educators have a greater responsibility to maintain connections to their university's research base and require skills to stay informed about relevant research findings.

In this context, evidence-based programs (EBPs) are becoming increasingly prominent to bridge the gap between research and practice (Hill & Parker, 2005; Molgaard, 1997). EBPs are programs or curricula that have been rigorously tested to validate their effectiveness (Dunifon, Duttweiler, Pillemer, Tobias, & Trochim, 2004). Grounded in clearly defined theoretical frameworks, EBPs convey research-based content using empirically validated delivery methods. The federal government and other major funders now often require grantees to use EBPs rather than locally developed "homegrown" programs. This pressure, combined with the availability of EBPs for many topics salient to Extension's constituents, makes EBPs a valuable tool for Extension to disseminate academic research to communities.

Although Extension is primarily a conduit for disseminating research-based knowledge, it is also a system that can enable wider community participation in research. There are a variety of Extension programs in which the users of the program also help collect data to inform practice; examples include Integrated Pest Management (Norton, Rajotte, & Gapud, 1999) and the Teen Assessment Project survey program conducted by the University of Wisconsin (Small & Hug, 1991). Combining a firm grounding in their communities with strong ties to their land-grant universities, Extension educators are ideally situated to disseminate EBPs and collaborate on research (originating from both academics and communities), as well as to communicate research results.

However, despite how integral research and research awareness are to the responsibilities of Extension educators, our review of the published literature encountered no studies that explored facilitators and barriers to accessing research (and research faculty) in the Extension system. To address this gap, the current study was undertaken to assess research awareness and involvement across the New York State Extension system and to investigate differences in these components among program areas.

We conceptualized research in the work of Extension educators as having three major components: 1) awareness of and attitudes toward research in Extension; 2) knowledge of research results, especially as embodied in evidence-based programs (EBPs); and 3) direct involvement in research activities. Our goal was to assess each of these components in a survey of New York State Extension educators. We analyzed responses separately by program area: 1) Agriculture and the Environment, 2) Family and Consumer Sciences, and 3) 4-H Youth Development, because the experience of the New York Extension system suggested that differences would exist among areas in research awareness and use.

Specifically, we hypothesized that research utilization and connections to campus researchers would be greater among agriculture and environment educators and those in family and consumer sciences than among 4-H youth development educators.

In agriculture and the environment, educators need to be recognized as scientific experts to be useful and credible to the people they work with, who are themselves often well informed about research. Educators in the family and consumer sciences often work with agency professionals and tend to be specialists in one or two areas, such as nutrition or child care.

4-H youth development educators, in contrast, work on a very wide array of subjects, drawn from disciplines in both human ecology and agriculture. Keeping up with a variety of fields from animal husbandry to textiles is difficult and may be impossible. Moreover, 4-H educators' main responsibilities do not necessarily require scientific expertise in the topic around which a 4-H club is organized; their generic field is youth development, one that is less clearly defined and in which research resources are far more limited than in areas such as nutrition or animal science. Therefore, we anticipated that 4-H youth development educators would report lower confidence regarding understanding and using research than the other two groups.

Methods

We conducted a survey of "research readiness" among off-campus Cornell Cooperative Extension educators in New York State. In the study, research readiness included knowledge of sources of research findings, awareness of evidence-based programs, and connections to research faculty on the central campus. We focused on the following questions:

  1. To what extent do Extension educators view research as central to their work?
  2. How do educators typically learn about research findings?
  3. Do educators feel they have adequate training, experience, and confidence about reading, interpreting, and applying research?
  4. How do educators regard evidence-based programs?
  5. To what extent are educators involved in research activities as an integral part of their work?

A web-based survey was administered in April 2010. The sampling frame included all 490 educators in Cornell Cooperative Extension, including community educators, program leaders, and issue leaders. The response rate was excellent: 81.2% (n=388). In all 58 associations, at least half of educators responded. In 44% of associations, 100% of educators responded.

Measures

Attitudes Toward Research and Its Use

We developed four items (listed in Table 2) that tapped educators' attitudes about the role research plays in their work. Response choices were on a four-point scale, from Strongly Agree to Strongly Disagree.
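As a brief illustration of this kind of scoring, a four-point Likert item can be collapsed into the "agree or strongly agree" percentages reported in the results tables. This is a hypothetical sketch; the response values below are invented, not data from the survey.

```python
# Hypothetical sketch: collapsing a four-point Likert item into the
# "agree or strongly agree" percentage reported for each survey item.
# The responses below are invented for illustration, not survey data.
from collections import Counter

# 1 = Strongly Agree, 2 = Agree, 3 = Disagree, 4 = Strongly Disagree
responses = [1, 2, 2, 1, 3, 2, 4, 1, 2, 2]

counts = Counter(responses)
pct_agree = 100 * (counts[1] + counts[2]) / len(responses)
print(f"{pct_agree:.1f}% agree or strongly agree")  # 80.0% for this invented sample
```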

Sources of Research-Based Information

To assess ways in which Extension educators obtain research-based information, we adapted a scale developed by Pravikoff and colleagues (Pravikoff, Tanner, & Pierce, 2005) to assess use of empirical evidence in daily practice. The scale asked how frequently respondents used various resources (listed in Table 3) and asked them to name Cornell researchers whose work they were familiar with. For faculty members identified, respondents were asked if they had ever contacted that individual. Following the questions regarding actual use, respondents were asked to rank the information sources in order of preference.

Barriers to Using Research

We adapted a measure developed by Pravikoff and colleagues (2005) that enumerates potential barriers to finding such information. Items assessed comfort in reading research articles and understanding statistics, problems accessing materials (e.g., because of lack of a computer or access to a library), and other barriers (listed in Table 4). Response choices were on a four-point scale, from Strongly Agree to Strongly Disagree.

Evidence-Based Programs

We developed four items to assess educators' confidence in understanding EBPs, their familiarity with programs or the ability to find EBPs, and their endorsement of them (Table 5). Response choices were on a four-point scale, from Strongly Agree to Strongly Disagree.

Involvement with Research Activities

Five items measured educators' experience with research methods and research projects (Table 6). Response choices were Yes or No.

Sample Description

Respondents are described in Table 1. Seventy-four percent were female. Ninety percent identified themselves as Caucasian. Fifty-eight percent had earned a master's degree, 39% a bachelor's degree, 2% an associate's degree, and 2% a Ph.D. In terms of years of experience in Cornell Cooperative Extension, approximately a third of the sample fell into each of three categories: 3 years or less, 4-9 years, and 10 or more years.

Respondents indicated their major area of responsibility by choosing one of four categories. Improved Quality of Life for Individuals and Families includes nutrition and health and children and families. Because most of this work draws on the College of Human Ecology, we abbreviate this group of educators as HE; they constituted 36% of respondents. Two of the categories draw primarily on the College of Agriculture and Life Sciences: Agriculture and Food Systems Sustainability, and Natural Resources and Environment. We combined these two and abbreviated them as AG; they accounted for 38% of respondents (26% and 12%, respectively). We designate as 4-H those educators who selected Youth Development as their major area: 26% of respondents. When educators indicated working in more than one area, we assigned them to the category with the largest time commitment.

Table 1.
Respondent Characteristics

Characteristic | Total
Associations (n) | 56
Respondents (n) | 388
Program Area
  AG (n) | 147
  HE (n) | 138
  4-H (n) | 103
Gender
  Male | 26.0%
  Female | 74.0%
Race
  Caucasian | 89.7%
  All other races | 10.3%
Education
  Associate's degree | 1.6%
  Bachelor's degree | 38.9%
  Master's degree | 57.7%
  Ph.D. | 1.9%
Years in Extension
  3 years or less | 32.5%
  4 to 9 years | 31.9%
  10 years or more | 35.7%

Results

Results are reported by survey topic below. In general, percentages refer to the percentage of all respondents who agreed or strongly agreed with a survey item, or who used or preferred a given research resource. We hypothesized that educators' "research readiness" would differ by program area. We assessed differences by calculating a chi-square statistic on the responses of educators in the three program areas. When differences between program areas were statistically significant (p < .05), meaning the observed differences are unlikely to be due to chance, responses are reported separately.
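The program-area comparison described above can be sketched as a chi-square test of independence on a contingency table of responses. The counts below are hypothetical, not the study's actual data, and scipy is assumed to be available.

```python
# Illustrative sketch of the analysis described above: a chi-square test of
# independence on agree/disagree counts across the three program areas.
# The counts are hypothetical, not the study's data; scipy is assumed available.
from scipy.stats import chi2_contingency

# Rows: AG, HE, 4-H; columns: [agree or strongly agree, disagree or strongly disagree]
observed = [
    [120, 27],  # AG (n = 147)
    [110, 28],  # HE (n = 138)
    [70, 33],   # 4-H (n = 103)
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")

# Following the reporting rule in the text: break results out by program
# area only when the difference is statistically significant.
if p < 0.05:
    print("Responses differ significantly by program area")
```

With three program areas and two response categories, the test has (3 − 1) × (2 − 1) = 2 degrees of freedom.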

Attitudes Toward Research and Its Use

Table 2 shows that 92% of New York Cooperative Extension educators agreed that research is relevant to their day-to-day work and fully 98% agreed that keeping informed about research is important to doing a good job. Although a large majority of educators in every program area agreed that research is valued in their work, fewer 4-H educators (74%) agreed than AG educators (88%). AG educators were least likely to agree that more money should be spent on practice and less on research: 27%, compared to 48% of HE and 66% of 4-H.

Table 2.
Attitudes Toward Research and Its Use

(Percent of respondents who agree or strongly agree with each item)

Survey Item | Total | AG | HE | 4-H
I feel that to do a good job, Extension staff must keep informed about research in their program areas. | 97.7% | 97.9% | 98.6% | 96.1%
Research is relevant to my real day-to-day work. | 91.5% | 91.1% | 92.8% | 90.3%
Research is valued in my work.* | 82.6% | 87.5% | 84.2% | 73.5%
More money should be spent on programming and less money should be spent on research.*** | 45.1% | 27.1% | 48.4% | 65.6%
*p<.05, ***p<.001

Sources of Research-Based Information

Sources of information used by educators and preferences for these sources are shown in Table 3. When asked which sources of research they used monthly or more often, three-quarters (74.6%) of educators named the Web. Newsletters with reports on current research were next (59.4%), followed by peers or colleagues (49.9%), research reports or fact sheets (45.7%), Cornell websites (44.2%), Cornell publications (34.6%), professional or scientific journals (32.2%), reference texts or manuals (28.8%), Cornell faculty (16.8%), other Cornell staff (16%), and conferences or workshops (14.5%).

In general, AG educators reported relying on original sources of research (journals and research reports) and on Cornell faculty more than educators in the other two program areas. Eighty-seven percent of AG educators reported having directly contacted a Cornell researcher, compared to 65% of HE and 50% of 4-H educators. AG educators also reported higher use of Cornell publications and websites.

Frequent attendance at Cornell in-service training (twice annually or more) was highest for HE educators (50%) and lowest for 4-H (17.6%), with AG falling in between (32.2%). 4-H educators expressed the highest preference for learning about research from peers. Preference for workshops as a source of research information was highest among 4-H and HE educators, yet 82.4% of 4-H educators reported actually attending Cornell in-service training once a year or less.

Table 3.
Sources Used to Learn About Research Findings

Survey Item | Total | AG | HE | 4-H

Report using research source once a month or more frequently:
World Wide Web* | 74.6% | 81% | 74.6% | 65.3%
Newsletters*** | 59.4% | 67.1% | 65.7% | 39.6%
Peers or colleagues** | 49.9% | 60% | 46% | 40.6%
Research reports or fact sheets*** | 45.7% | 60% | 45.7% | 25.5%
Cornell website*** | 44.2% | 62.1% | 33.6% | 33%
Cornell publications*** | 34.1% | 57.2% | 27.8% | 9.7%
Professional or scientific journals*** | 32.2% | 32.9% | 44.1% | 15%
Reference text or manual** | 28.8% | 37.7% | 27.7% | 17.6%
Contact with Cornell faculty*** | 16.8% | 29.9% | 10.9% | 5.9%
Contact with Cornell staff other than faculty*** | 16% | 25.3% | 9.6% | 10.9%
Conferences or workshops | 14.5% | 16.3% | 14.5% | 11.9%

Attendance at Cornell in-service training***:
1x/year or less frequently | 65.4% | 67.8% | 50% | 82.4%
2x/year or more frequently | 34.6% | 32.2% | 50% | 17.6%

Have contacted a Cornell researcher directly***:
No researchers contacted | 30.7% | 12.9% | 34.8% | 50.5%
Contacted 1 or 2 researchers | 17% | 7.5% | 23.2% | 22.3%
Contacted 3 or more researchers | 52.3% | 79.6% | 42% | 27.2%

Most preferred source for learning about research:
World Wide Web* | 24.3% | 20% | 22.1% | 33.7%
Conferences or workshops | 21.8% | 19% | 23.4% | 23.5%
Peers or colleagues | 14.6% | 13.1% | 11.7% | 20.6%
Contact with Cornell faculty* | 13% | 18.6% | 9.6% | 9.7%
Professional or scientific journals | 9% | 9.9% | 11.8% | 3.9%
Research reports or fact sheets | 8.5% | 9.5% | 11.7% | 2.9%
Newsletters | 6.2% | 6.9% | 8.8% | 1.9%
Contact with Cornell staff other than faculty | 1.9% | 2.1% | 0.8% | 3%
Reference text or manual | 1.8% | 2.1% | 1.5% | 2%

*=p<.05, **=p<.01, ***=p<.001

Research Competencies and Barriers

Table 4 shows respondents' endorsements regarding barriers to research and its use. A large majority of educators (82.7%) reported that they did not have difficulty understanding research articles. An item about the difficulty of understanding statistical analyses was rejected by just over half (56.3%) of the educators, indicating that statistics are a barrier for almost half. About two-thirds of respondents said they were confident about their ability to interpret research results (64.2%), and even more (84.0%) said they were confident about their ability to use research. Responses to these questions were similar across program areas. This self-reported proficiency may result in part from the fact that more than three-quarters of respondents (77.6%) said they had taken a course in research methods as part of their undergraduate or graduate training.

However, nearly three-quarters (73.2%) agreed that they did not have enough time to read research. When asked whether there is a lack of relevant research, 72% disagreed. Difficulty gaining access to research was cited as an issue by nearly one-third of respondents (29.3%). This difficulty appears to lie in finding research rather than in lacking computer access or computer skills: nearly all respondents disagreed that these were problems (97.3% and 93.7%, respectively).

Table 4.
Research Competencies and Barriers

(Percent of respondents who agree or strongly agree with each item)

Survey Item | Total | AG | HE | 4-H
I feel confident in my ability to use research findings in practice. | 84% | 88.1% | 84.6% | 77.5%
There is not enough time to read research. | 73.2% | 76.6% | 67.9% | 75.5%
I feel confident in interpreting research findings using descriptive statistics. | 64.2% | 67.1% | 65.2% | 58.8%
Statistical analyses are not understandable.* | 43.7% | 37.8% | 42.5% | 53.4%
I have difficulty accessing research materials. | 29.3% | 26.1% | 26% | 37.9%
There is a lack of relevant research findings. | 28% | 27.9% | 27.9% | 28.4%
I have difficulty understanding research articles. | 17.3% | 12.7% | 19% | 21.4%
I don't have library access.* | 10.8% | 5.6% | 13.3% | 14.7%
I don't have the computer skills necessary for accessing research. | 6.3% | 6.2% | 5.9% | 6.9%
I lack access to a computer. | 2.6% | 2.1% | 3.7% | 1.9%
*=p<.05

Evidence-Based Programs

Educators expressed confidence in their ability to distinguish evidence-based programs from others (71.3%; Table 5). AG and HE staff indicated that they could differentiate between EBPs and non-EBPs at significantly higher rates than 4-H staff (75.7% and 78.1% compared to 56.1%). Educators also expressed confidence about knowing where to find information about EBPs (69.2%), but 4-H educators agreed significantly less often than HE educators (57.1% compared to 76.9%), with AG educators falling in between (70.5%). The same pattern appeared when respondents were asked whether they knew about specific EBPs related to their work: HE led (74.6%), followed by AG (68.1%), then 4-H (53.6%). A large proportion (43.5%) said they did not think EBPs are necessarily better than programs educators develop themselves.

Table 5.
Evidence-based Programs

(Percent of respondents who agree or strongly agree with each item)

Survey Item | Total | AG | HE | 4-H
I feel confident that I can tell an evidence-based program from one that is not evidence-based.*** | 71.3% | 75.7% | 78.1% | 56.1%
I know where to go to find information on evidence-based programs in the areas I work on.** | 69.2% | 70.5% | 76.9% | 57.1%
I know about several specific evidence-based programs related to my work areas.** | 66.6% | 68.1% | 74.6% | 53.6%
I don't think evidence-based programs are necessarily better than programs Extension educators develop themselves. | 43.5% | 42.6% | 38% | 51.6%
**=p<.01,***=p<.001

Involvement with Research Activities

Finally, we asked educators about their actual engagement in research projects. AG educators were more likely than educators in the other program areas to report having had jobs that involved participation in research projects (82.3% compared to 61.8% HE and 48.5% 4-H; Table 6). Reports of ever having participated in a research project or of having participated in a research project as an Extension educator followed the same pattern, with AG educators highest and 4-H educators lowest. Ninety-five percent of educators who reported experience with research projects said it had been a positive experience.

Table 6.
Involvement with Research Activities

(Percent of respondents who replied "yes" to each item)

Survey Item | Total | AG | HE | 4-H
Positive experience with research projects | 95% | 94.3% | 95.9% | 95%
Had a course in research methods | 77.6% | 72.8% | 81.2% | 79.6%
Had jobs that involved participation in research projects*** | 66.1% | 82.3% | 61.8% | 48.5%
Ever participated in a research project | 52.9% | 66.7% | 52.2% | 44.3%
Participated in a research project as part of Extension*** | 49.2% | 65% | 47% | 30%
***=p<.001

Discussion

Results from the survey reported here suggest that county-level educators are ready and willing to ground Extension even more firmly in research. Their attitudes, experiences, and preparation are properly aligned with this goal. However, several challenges must be overcome to achieve it.

Educators' responses demonstrate a strong degree of interest in research and recognition of its value. Educators expressed confidence in their ability to read and understand research reports, including journal articles, and to learn about EBPs. They report consulting research frequently. The nearly unanimously positive view of research only faded when spending for research was directly counterposed to spending for programs. In that case, many educators preferred funding programs, which may not be surprising from respondents whose job is to run programs in a difficult economic environment.

Two trends in the response patterns raise concerns. First, analysis of the results by program area highlighted that 4-H educators reported less proficiency in research than educators in HE and AG and less contact with faculty. This pattern may reflect idiosyncrasies in the Cornell Extension system. Many more professorial faculty members work in areas supporting AG and HE, whereas only a few specialize in youth development. As a result, 4-H educators' opportunities for contact with researchers are limited, whatever their preferences might be. Additional research is needed to determine whether these differences by program area are a New York State phenomenon or hold nationally. The differences we discovered point to the need to assess research-readiness by program area. Greater understanding of such differences will be useful in targeting research training to specific groups of educators and in making appointments.

The second trend that deserves further research is educators' dependence on and preference for the Internet and peers as sources of research. All educators listed the Web as their primary source of research information, followed by newsletters and peers. It is understandable that educators would look to their peers for information about programs, but peers may not be the optimal source of scientific information. Similarly, the Web is not always an accurate or reliable source of valid research. It was not feasible in our survey to ask what kind of information educators glean from peers or what websites they visit to find research. Understanding how educators assess the quality of information—whether from the Web or from peers—and how research-based information travels within Extension is a promising area of research. Future research of this nature should be guided by the empirical and conceptual work of such leaders as Rogers (2003) on diffusion, Fixsen, Naoom, Blase, Friedman, and Wallace (2005) on implementation, and Nutley, Walter, and Davies (2007) and Weiss (1979) on research utilization.

A broader implication of the study's findings is that Extension should develop and test a variety of new methods to connect faculty researchers with county-level educators. Electronic methods are clearly essential but not likely sufficient. Optimal combinations of face-to-face and remote communication should be identified. Current research on communication and online learning can inform such efforts. Evaluation research should inform continuous improvement of these efforts. As effective methods are developed, both content and techniques should be shared among land-grant universities.

Although the issue was not probed in the survey, it is worth asking how much of the gap between research and practice is attributable to a gap between researchers' and practitioners' priorities. Questions deemed important in the disciplines may not respond to practitioners' needs. Land-grant universities, which combine research capacity with the responsibility and ability to respond to practitioners' priorities, have unique capacity to link these two domains. Comparative surveys of research faculty regarding attitudes toward Extension and outreach would shed light on this issue.

In conclusion, we propose that what happens between the production of new knowledge in universities and its use in "the real world" can no longer be treated as a black box. That process must be examined with care and subjected to systematic and continuous improvement. The results of the present study provide encouraging data regarding interest in research and awareness of its importance among educators. Building on these strengths, the Extension system can and should work to create a stronger link between the creation of knowledge and its use by county educators.

References

Dunifon, R., Duttweiler, M., Pillemer, K., Tobias, D., & Trochim, W. M. K. (2004). Evidence-based Extension. Journal of Extension [On-line], 42(2), Article 2FEA2. Available at: http://www.joe.org/joe/2004april/a2.php

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida. The National Implementation Research Network (FMHI Publication #231).

Harder, A., Place, T., & Scheer, S. D. (2010). Towards a competency-based extension education curriculum: A Delphi study. Journal of Agricultural Education, 51, 44-52. Retrieved from: http://www.jae-online.org/attachments/article/84/Vol%2051%20No%203%20pg%2044%20-%20Harder.pdf

Hemsley-Brown, J. V. (2004). Facilitating research utilisation: A cross sector review of the research evidence. International Journal of Public Sector Management, 17(6), 534-553.

Hill, L. G., & Parker, L. A. (2005). Extension as a delivery system for prevention programming: Capacity, barriers, and opportunities. Journal of Extension [On-line], 43(1) Article 1FEA1. Available at: http://www.joe.org/joe/2005february/a1.php

Lerner, R. M., Lerner, J. V., & colleagues (2011). The positive development of youth: Report of the findings from the first seven years of the 4-H Study of Positive Youth Development. Medford, MA: Tufts University. Retrieved from: http://ase.tufts.edu/iaryd/researchPositive4HPublications.htm

Molgaard, V. K. (1997). The extension service as a key mechanism for research and services delivery for prevention of mental health disorders in rural areas. American Journal of Community Psychology, 25, 515-544.

Norton, G. W., Rajotte, E. G., & Gapud, V. (1999). Participatory research in integrated pest management: Lessons from the IPM CRSP. Agriculture and Human Values, 16(4), 431-439.

Nutley, S., Walter, I., & Davies, H. (2007). Using evidence: How research can inform public services. Bristol, UK: The Policy Press.

Pravikoff, D. S., Tanner, A. B., & Pierce, S. T. (2005). Readiness of US nurses for evidence-based practice: Many don't understand or value research and have had little or no training to help them find evidence on which to base their practice. AJN The American Journal of Nursing, 105(9), 40-51.

Rogers, E. M. (2003). Diffusion of innovations, Fifth edition. New York: Free Press.

Sabir, M., Breckman, R., Meador, R., Wethington, E., Reid, M. C., & Pillemer, K. (2006). The CITRA Research-Practice Consensus Workshop Model: Exploring a New Method of Research Translation in Aging. The Gerontologist, 46, 833-39.

Small, S. A., & Hug, B. E. (1991). The teen assessment project: Tapping into the needs and concerns of local youth. Journal of Extension [On-line], 29(1), Article 1FEA7. Available at: http://www.joe.org/joe/1991spring/a7.php

Trumbull, D. J., Bonney, R., Bascom, D., & Cabral, A. (2000). Thinking scientifically during participation in a citizen-science project. Science Education, 84(2), 265-275. doi: 10.1002/(SICI)1098-237X(200003)84:2<265::AID-SCE7>3.0.CO;2-5

Wethington, E., & Dunifon, R. (eds.) (2012). Research for the public good: Applying the methods of translational research to improve human health and well-being. Washington: American Psychological Association.

Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431.