Summer 1989 // Volume 27 // Number 2 // Feature Articles // 2FEA5
What's an easy, simple, reliable, and valid way to measure whether a program has impact? This question is asked frequently by Extension agents and specialists as they respond to accountability needs within the Extension organization. The "post-then-pre" method of self-report evaluation offers one solution for documenting behavior change. The data collection instruments are relatively easy to develop, use, and analyze. Results are credible and indicate program impact even though the process seems backwards.
Problems with Typical Approach
In Extension, a typical approach has been to use a pretest-posttest research design to document behavior change. However, in certain types of self-report program evaluation, pretest-posttest comparisons may inaccurately assess instructional impact because participants may have too little knowledge at the beginning of a program to assess their baseline behaviors accurately. By the end of the program, their new understanding of the content may change how they respond to a self-assessment. If a pretest was given at the beginning of the program, participants have no way to correct an answer at the end if their baseline assessment was inaccurate.1
The evaluation problem, then, is that a pretest taken at the beginning of an Extension education program may be invalid because participants have limited knowledge in responding accurately to the questions being asked on the pretest. Consider the following pretest question for a nutrition program: "Do you include one food rich in vitamin C in your diet daily?" To answer this question accurately, the respondent must have some idea of which foods are rich in vitamin C. A participant who doesn't know which foods are rich in vitamin C may overestimate vitamin C intake on the pretest. After actually participating in the Extension program and learning about foods rich in vitamin C, the participant can validly answer the question.
Now suppose the participant has increased vitamin C intake as a result of the program. On the posttest, aimed at measuring this change in behavior, the participant reports the same level of vitamin C intake as reported on the pretest. The posttest level is accurate, but because the pretest was an overestimate (due to the participant's lack of knowledge), it will appear that no change in behavior has occurred between pretest and posttest. Such an evaluation result makes it appear that the program had no effect on behavior when, in fact, the program significantly increased vitamin C intake.
The post-then-pre design corrects this problem. The problem is handled by not giving a pretest at the beginning of the program. Then, at the end of the program, the participant answers two questions. The first question asks about behavior as a result of the program. This is the posttest question. Then the participant is asked to report what the behavior had been before the program. This second question is really the pretest question, but it's asked after the program when the participant has sufficient knowledge to answer the question validly. That's why this approach is called post-then-pre.
Table 1 illustrates both the traditional pre-then-post approach for the vitamin C example and the post-then-pre approach. In the pre-then-post example, the participant incorrectly reported "often" eating vitamin C rich foods on the pretest when the accurate answer should have been "seldom." The valid posttest answer is "often." With the pre-then-post approach, Table 1 shows no behavior change. However, with the post-then-pre approach, behavior change is demonstrated because the pretest response is "seldom" and the posttest response is "often."
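The scoring contrast in the vitamin C example can be sketched numerically. The snippet below uses hypothetical scores on the instrument's 1-5 frequency scale to show how the inflated pretest masks the change that the post-then-pre design reveals:

```python
# Hypothetical scores on a 1-5 frequency scale
# (1 = almost never ... 5 = almost always).

# Pre-then-post design: the pretest "Often (4)" was an overestimate,
# made before the participant could identify vitamin C rich foods.
pretest = 4    # reported at the start of the program (inaccurate)
posttest = 4   # reported at the end of the program (accurate)
pre_post_change = posttest - pretest
print("pre-post change:", pre_post_change)  # 0 -> no change detected

# Post-then-pre design: both answers are given at the end of the
# program, in the same (informed) frame of reference.
post = 4       # "How often do you do this now?"
then_pre = 2   # "How often did you do this before the course?" (Seldom)
post_then_pre_change = post - then_pre
print("post-then-pre change:", post_then_pre_change)  # 2 -> change detected
```

The same posttest answer yields a change score of 0 under one design and 2 under the other; only the retrospective pretest reflects the participant's actual baseline.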
The "post-then-pre" design accounts for changes in learners' knowledge by having participants first report present behaviors (post) and then rate how they perceived those same behaviors just before taking the course (then pre). The retrospective pretest at the end of the program is more accurate because it's answered in the same frame of reference as the posttest. Thus, the problem known as "response-shift bias" in self-report, pre-post designs is minimized.2
Using Post-Then-Pre for a Nutrition Course
Finding ways of reaching today's audiences that will result in healthier nutritional habits is a challenge for nutrition educators. Recognizing that simply providing cognitive-based education won't necessarily mean positive dietary changes, a course entitled "Eating Today for a Healthier Tomorrow" (ETHT) was developed. The focus was to provide a learning environment that facilitated behavior change by involving participants in behavioral goal setting, providing correct nutritional facts, and teaching processes for evaluating dietary information.
ETHT, targeted to adults, enabled participants to use food and nutrition practices that help reduce the risks of heart disease, cancer, osteoporosis, and obesity. An instructor's manual provided directions for teaching the material and a participant's manual included all the basic dietary information. An Extension home economist and a registered dietitian taught the course cooperatively in six sessions of 2 1/2 hours each.
To assess the impact of ETHT, Extension specialists and agents developed a post-then-pre self-report instrument for instructors to use. Table 2 illustrates some of the 30 practices that represent types of behaviors to be maintained or changed as a result of taking the course. As part of the overview in the first session, instructors gave participants a copy of the 30-item list to keep in their course notebooks. At the final session, participants were asked to complete the self-evaluation instrument indicating how frequently they did each practice before the course and at the end of the course.
Coding instructions were developed so staff could enter the data into a microcomputer file. Using SAS3 on a mainframe, a computer program was written to analyze the data and summarize the results for each practice. A chart format was used for the computer output and included the mean scores for post (present score) and then (pre score) along with the t-value and probability level of a paired t-test. The agents then graphed the results for their class on a standard form (Figure 1).
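The per-practice summary described above (mean post score, mean then score, and a paired t-test) came from a SAS program on a mainframe. As a hedged illustration of the same calculation, the sketch below computes the paired t statistic from scratch in Python (a substitution for the original SAS analysis), using hypothetical class data:

```python
import math
import statistics

def paired_t(post, then):
    """Paired t-test on matched post and retrospective-pre scores.

    Returns (mean_post, mean_then, t), where t = mean(d) / (sd(d) / sqrt(n))
    and d is the list of within-participant differences.
    """
    diffs = [p - t for p, t in zip(post, then)]
    n = len(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of differences
    t = statistics.mean(diffs) / (sd / math.sqrt(n))
    return statistics.mean(post), statistics.mean(then), t

# Hypothetical responses for one practice from a class of 8 participants
post_scores = [4, 5, 4, 3, 5, 4, 4, 5]  # behavior at end of course
then_scores = [2, 3, 2, 2, 3, 2, 3, 2]  # retrospective pre score

mean_post, mean_then, t = paired_t(post_scores, then_scores)
print(f"mean post = {mean_post:.2f}, mean then = {mean_then:.2f}, t = {t:.2f}")
```

A large t value with a small probability level, as in the article's output chart, indicates that the mean shift from the retrospective pre score to the post score is unlikely to be due to chance.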
Each of the Extension agents (13 in 1987 and 16 in 1988) who taught the ETHT course received the results for their class. Data from all courses taught across the state were collapsed into a statewide summary for use by the nutrition specialist and other Extension staff.
Table 1. Comparing pre-post and post-then-pre scoring.

Hypothetical self-report for "include one vitamin C rich food daily":

Pre-then-post approach:
    Pre score = Often (4)
    Post score = Often (4)

Post-then-pre approach:
    Post score = Often (4)
    Retrospective pre score (then) = Seldom (2)
Table 2. Sample practices included on self-report post-then-pre instrument.

Scale: 1 = almost never, 2 = seldom, 3 = about half the time, 4 = often, 5 = almost always

Practice                                               End of ETHT   Before ETHT
Set goals for changing my health behavior              1 2 3 4 5     1 2 3 4 5
Use information from nutrition labels                  1 2 3 4 5     1 2 3 4 5
Leave extra food on plate when it's more than I need   1 2 3 4 5     1 2 3 4 5
Include one vitamin C rich food daily                  1 2 3 4 5     1 2 3 4 5
Limit egg yolks to three or less per week              1 2 3 4 5     1 2 3 4 5
Figure 1. Sample of report for post-then-pre data.
Using the Results
The Extension agents used the results in three ways. First, they reviewed changes participants made and associated these changes with the course content and teaching methods. If participants made no change in certain behaviors, agents questioned whether they needed to alter their teaching method or amount of emphasis placed on the topic. Second, agents used the data indicating behavior change to support course impact. The impact findings provided accountability to local Extension boards and councils as well as to the Extension organization. Third, the results were shared with local residents through newspaper releases. And, they provided good testimonials as agents promoted new classes.
The nutrition specialist and other state Extension staff used the statewide summary to report course impact to a funding agency as well as to ES-USDA. A summary of the findings has also been used by administrative staff with legislators, boards and councils, and other decision makers.
Using a post-then-pre design to identify self-reported behavioral changes can provide substantial evidence of program impact. Although a nutrition example was provided here, the methodology can be easily adapted to other Extension programs. In principle, Extension specialists and agents develop programs from a set of behavioral objectives. The challenge in constructing a post-then-pre evaluation instrument is to identify specific behaviors that may change and then develop an appropriate measurement scale for the amount of self-perceived behavior change. Clientele can complete a post-then-pre instrument in a relatively short time. Computers make accurate data analysis readily accessible, and results can be reported simply and effectively without a great deal of staff time. Using a post-then-pre evaluation design greatly helps specialists and agents document how Extension programs effect change in people's lives.
1. Robert L. Linn and Jeffrey A. Slinde, "The Determination of the Significance of Change Between Pre- and Posttesting Periods," Review of Educational Research, XLVII (Winter 1977), 121-50; and George S. Howard and others, "Internal Invalidity in Pretest-Posttest Self-Report Evaluations and a Reevaluation of Retrospective Pretests," Applied Psychological Measurement, III (Winter 1979), 1-23.
2. George S. Howard and Patrick R. Dailey, "Response-Shift Bias: A Source of Contamination of Self-Report Measures," Journal of Applied Psychology, LXVI (No. 2, 1979), 144-50; Ellen R. Benjamin, "Using the 'Post-Then' Method of Evaluation," Training (November 1982), p. 72; and Robert C. Preziosi and Leslie M. Legg, "Add 'Then' Testing To Prove Training's Effectiveness," Training (May 1983), pp. 48-49.
3. SAS Institute Inc., SAS User's Guide: Basics, Version 5 Edition (Cary, North Carolina: SAS Institute Inc., 1985).