April 1999 // Volume 37 // Number 2 // Commentary // 2COM1
Results? Behavior Change!
Planning programs that result in clientele changing their behavior challenges Extension professionals. These challenges include motivating our clientele to adopt new behaviors, supporting individuals as they make changes, determining what constitutes a behavior change, and measuring the degree of change. Time spent documenting behavior change as required by the Government Performance and Results Act may result in fewer programs but greater impact from educational efforts. In addition to these implications for program staff, administrators, too, are challenged by the need to substantiate practice adoption.
"Knowing and not doing is equal to not knowing." This simple comment, recently found inside my fortune cookie, is the cause of my uneasiness about many of the "one-shot" programs we, as Extension agents, are asked to conduct. A "one-shot" lesson has limited impact if it does not result in some "doing" by the participants. Only then does true "knowing" take place.
Some participants, already motivated to change or improve practices, may take action after a program and make a behavior change. Others may be motivated during a single, carefully planned presentation to move toward adopting a new practice. The Extension educator may provide additional information and support to assist these individuals in adopting best practices. In both cases, follow-up evaluation techniques must be administered to determine whether knowledge has been translated into "doing."
Unless we follow up with program participants to determine behavioral change, we cannot show that our programs are successful. Legislators agree that impact means behavior change. The Government Performance and Results Act requires that each Extension professional document the number of program participants who have changed their practices. These numbers must be recorded in our yearly report of results.
The requirements of the act should be welcomed by Extension professionals. We are challenged to make time in our schedules to document program impact in terms of the number of clients moving toward practice change. We have the opportunity to justify reducing the number of programs we present in order to prove that those we do undertake are effective. We must build time into our programs to support our clientele as they pursue a particular behavior change. As we provide this support, we will become more personally involved with the individuals we serve, and we will learn how future programs may be changed to make them more effective in encouraging practice adoption.
Behavior Change Takes Time, Perhaps Some Support
If the objective of an Extension program is to help individuals adopt practices that will improve their lives, then presenting a one- or two-hour program is likely to have minimal impact. Attempts to motivate each participant, explain the steps in a process, provide resources, and allow time to "get started" on each step are not enough to ensure behavior change.
Many people need time to collect information relevant to their own situation, apply the knowledge presented, and practice the skills learned. Some may need encouragement and support in the form of more information, guided practice, and affirmation that they are making progress as they move toward integrating the new competency into their daily lives. This complete adoption of a practice does not usually take place in a "one-shot" presentation or workshop, but over a period of time.
The effectiveness of a camp counselor training program may be judged by observing the performance of counselors during the actual camping experience and comparing it to their skills prior to the program. The success of efforts by Extension professionals to ensure safe pesticide application practices can be determined by asking those who participated in pesticide applicator training whether they used recommended practices new to them the next time they applied pesticides.
If we truly want to determine whether our financial education programs are making an impact, we must document increased savings, reduced debt, or achievement of another financial goal established at the beginning of the program. The ultimate goal of the Extension educator is evidence that the practices we promote have been internalized.
What Constitutes Change? Must It Be All or Nothing?
If the ultimate goal is the habitual performance of a recommended practice, does change occur only when a practice has been completely adopted, or will evidence of movement toward change count? Prochaska, Norcross, and DiClemente (1994) suggest six stages of change: precontemplation, contemplation, information gathering, action, adoption, and internalization. If we can document movement from one of these stages to another, then, according to the authors, we have produced behavior change.
For example, a person may move from not considering any change in behavior (precontemplation) to gathering information that will assist in adopting the new behavior. The move from one stage of change to another indicates we have made an impact, even though the practice has not been completely internalized. Prochaska et al. provide guidance in developing check charts or questionnaires that can be used to determine where participants are in the change process at the beginning of an educational program, at its end, and some time afterward. Analysis of these charts will show what change has occurred.
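The stage-by-stage tally described above can be sketched in a few lines of code. This is a hypothetical illustration only: the stage labels follow this article's paraphrase of the model, and the participant data and function names are invented for the example.

```python
# Hypothetical sketch of analyzing stage-of-change questionnaires.
# Stage labels follow this article's paraphrase; data is invented.

STAGES = ["precontemplation", "contemplation", "information gathering",
          "action", "adoption", "internalization"]

def stage_movement(before, after):
    """For each participant, count stages moved (positive = forward)."""
    return {pid: STAGES.index(after[pid]) - STAGES.index(before[pid])
            for pid in before}

def count_movers(before, after):
    """Count participants who advanced at least one stage."""
    return sum(1 for delta in stage_movement(before, after).values() if delta > 0)

# Example: stages recorded at the start of a program and at follow-up.
before = {"p1": "precontemplation", "p2": "contemplation", "p3": "action"}
after  = {"p1": "contemplation", "p2": "information gathering", "p3": "action"}

print(count_movers(before, after))  # 2 participants moved forward
```

A report of results could then state the number of participants who advanced at least one stage, rather than only the number who fully internalized the practice.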
Challenges for Extension Professionals
The need for documentation of behavior change has significant programming implications for Extension professionals in all areas of specialization. When designing quality education programs that are intended to encourage adoption of new practices, we should ask ourselves the following questions:
1. What are my priorities for programming? These must be clearly defined and based on our own individual strengths and identified gaps in local programs.
2. Have I clearly stated the objectives for the program so that I know the specific behavior change desired, and the time frame for this change to take place?
3. How will I determine if change in behavior has indeed occurred? Evaluation tools must be developed during program planning. We should consider designing these tools to measure change from one stage to another, for example from information gathering to action.
4. How will I motivate participants to begin to change their behaviors?
5. Will time be needed for participants to collect data necessary to take action?
6. Will a follow-up session be needed before action can be taken or will individual counseling be more useful? Has time been scheduled to provide this support? Our concern must be that without support, some clientele may do nothing at all, diminishing the impact of the program.
7. Has time to administer follow-up evaluation tools been built into program plans? This is often done either by mail or telephone.
8. How many of the clientele in a program can I realistically expect to adopt new practices? It will certainly be fewer than the number of participants in the program; not everyone will be ready to change. However, our goal should be to move our clientele from one stage of change to another in order to maximize program impact in terms of adoption of best practices.
The number of programs conducted may have to be decreased in order to make time to provide support and to determine the direction and degree of behavior change.
Challenges for Administrators
Administrators must be proactive in assisting Extension professionals with promoting and documenting behavior change.
1. Practice adoption must be defined, and movement toward behavior change must count as practice adoption. An effort that moves a program participant from contemplating a new behavior to gathering the information necessary to make the change shows impact. This definition must be provided before the beginning of the program year so that effective programs can be designed to promote practice change.
2. Techniques for evaluating behavior change must be part of in-service training and support for Extension professionals.
3. Support must be given to Extension professionals who reduce the number of programs they conduct, whether by dropping programs or by declining invitations to facilitate programs that require preparation time but result in little impact.
4. Administrators must accept, and even encourage, a reduction in the number of face-to-face contacts in return for an increased number of program participants who report or demonstrate adoption of, or movement toward, new practices, thereby improving evidence of the impact of Extension programming.
5. A process must be established to provide Extension professionals the flexibility needed to pursue behavior change or practice adoption rather than new programming.
A Final Note
Many of the suggestions above are already being addressed, and they may seem elementary. However, the shift from reporting numbers of contacts to reporting measured impact will be a challenge for some of us. Certainly, it has implications as we plan our programs, which should include fewer programs, more follow-up sessions, and specific time dedicated to measuring change. Successful documentation will likely impress legislators and funders and help assure the future of Extension programs.
Prochaska, J.O., Norcross, J.C., & DiClemente, C.C. (1994). Changing for good. New York: The Hearst Corporation.