August 2017 // Volume 55 // Number 4 // Tools of the Trade // v55-4tt4
An Incremental Approach to Improving Evaluation Response Rates for Multiday Events
Online survey systems have reduced evaluation costs and improved efficiency but tend to produce lower response rates. We developed an incremental approach for online evaluation of multiday events. The incremental approach splits a complete evaluation into smaller sections and provides respondents access to both current and past sections. We selected two annual events (one national and one state-level) at which to evaluate the approach. Overall, evaluation response rates with the incremental approach averaged 45.9%, nearly double the 25.3% response rate that is typical with traditional online evaluations. Use of this incremental approach for online evaluation resulted in improved assessment of the respective events, suggesting its usefulness for future event evaluations.
Introduction and Background
Participant evaluation is critical to the success of any Extension event. Use of web-based (online) evaluations has increased in popularity over the past several years. There are several benefits to this style of evaluation, including low cost and ease of implementation (Archer, 2008; Monroe & Adams, 2012). However, online evaluations tend to have lower survey response rates than traditional mail or phone evaluations do (Dillman, Smyth, & Christian, 2009; Nulty, 2008; Petchenik & Watermolen, 2011; Tobin, Thomson, Radhakrishna, & LaBorde, 2012). Researchers have explored several methods for increasing response rates, such as using personalized and repeated contacts with participants (Dillman et al., 2009; Nulty, 2008).
One of the biggest challenges in using online evaluations is designing a comprehensive evaluation that also demands minimal time investment by the respondent (Dillman et al., 2009). This task can be especially difficult for multiday events because of the diverse activities requiring appraisal. We developed an incremental approach for multiday events that allows for soliciting feedback on event activities as they transpire. The incremental approach encourages event participants to provide responses about activities as they happen, while impressions are fresh in their memories. The incremental approach involves two key aspects: (a) development of small, manageable sections of an evaluation to which respondents can respond quickly and (b) development of a reminder method that engages respondents repeatedly during (and shortly after) the event, underscoring the importance of participation in the evaluation.
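Key aspect (a), splitting a full evaluation into small, themed sections, can be sketched in a few lines of Python. This is an illustrative sketch of the section layout described in this article (pre-event questions first, one section per event day, summary questions last); the class and function names are our own, not part of any survey software.

```python
from dataclasses import dataclass

@dataclass
class Section:
    """One small, quickly completed portion of the full evaluation."""
    title: str
    questions: list

def build_incremental_evaluation(event_days, pre_event_qs, summary_qs, daily_qs):
    """Split a complete evaluation into manageable sections:
    a pre-event section, one section per event day, and a summary section.
    `daily_qs` maps each day to its session/activity questions."""
    sections = [Section("Pre-event", pre_event_qs)]
    for day in event_days:
        sections.append(Section(f"Day {day}", daily_qs.get(day, [])))
    sections.append(Section("Event summary", summary_qs))
    return sections
```

For a three-day event, this yields five sections that can then be released one at a time as the event unfolds.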
Materials and Methods
The incremental approach was implemented at two multiday events: the National Association of County Agricultural Agents (NACAA) Annual Meeting and Professional Improvement Conference (AM/PIC) and Penn State Extension's annual Dairy Nutrition Workshop (DNW). The incremental approach was used at the NACAA AM/PIC in 2013 only. In contrast, the approach was first implemented at the DNW in 2013 and was continued in subsequent years. For both events, results from several prior years of traditional online evaluations were available for comparison. Event evaluators used the SurveyMonkey online system for all evaluations. The evaluation design, collection setup, and implementation involved the following key factors:
- Evaluation design
- Questions were organized in sections, with each section centering on a common theme (typically activities or sessions for a given day). In addition, the first section included pre-event questions, and the last contained event summary questions.
- The first page of the survey consisted of one question that allowed respondents to select the section they wished to complete.
- A page at the end of each section informed respondents that they had completed the most recent section but had not completed the entire evaluation.
- Page logic inherent in SurveyMonkey allowed event evaluators to control access to different sections of the evaluation.
- Once multiple sections were available, respondents were redirected to the first page (the section selection page) when they accessed any section other than the most recent.
- Respondent information was loaded into a SurveyMonkey collector specific to the evaluation. The collector is a feature that controls how the evaluation is accessed, such as via individual email (unique URL) or a general web link. With the individual email option, each respondent receives via email a unique URL where his or her responses are recorded, allowing respondents to return and monitor their progress. For the NACAA AM/PIC and DNW events, the collector was set to allow respondents to return to the evaluation at any time.
- Respondents who had opted out of direct SurveyMonkey emails instead received a general web link from a separate collector.
- After configuration of the collectors, the event evaluators set the page logic and sections for the first part of the evaluation.
- Respondents received an email containing a link to the first section of the evaluation and encouraging them to respond before release of the next section.
- The following steps preceded the release of each subsequent section:
- The new section was "unhidden" so that respondents could access it.
- Page logic for the last page of the previous section redirected the respondent to the page for selecting the new section, and the last page of the new section redirected to an "end of section" page.
- Potential respondents received one of two email messages from SurveyMonkey. Those who had started the evaluation received a message informing them that the next section was available. Nonrespondents received a message encouraging them to start the evaluation and informing them that the next section was available.
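The release-and-redirect steps above can be sketched as a small state machine. The Python below is conceptual, not SurveyMonkey's API; SurveyMonkey implements the equivalent behavior through its page-logic and collector settings, and all names here are illustrative.

```python
class IncrementalSurvey:
    """Conceptual model of the incremental release workflow:
    sections are 'unhidden' one at a time, respondents requesting any
    section other than the newest are redirected to the section-selection
    page, and reminder text differs for respondents and nonrespondents."""

    def __init__(self, sections):
        self.sections = sections  # ordered section titles
        self.released = []        # sections respondents can currently access

    def release_next_section(self):
        # "Unhide" the next section so that respondents can access it.
        nxt = self.sections[len(self.released)]
        self.released.append(nxt)
        return nxt

    def landing_page(self, requested):
        # Only the most recently released section is served directly;
        # any other request redirects to the section-selection page.
        if requested == self.released[-1]:
            return requested
        return "section selection"

    def reminder_message(self, has_started):
        # Two reminder variants: one for respondents who have begun the
        # evaluation, one encouraging nonrespondents to start it.
        if has_started:
            return "The next section of the evaluation is now available."
        return "Please start the evaluation; the next section is now available."
```

Releasing a section and probing the redirect behavior shows the logic: once "Day 1" is released, a request for the earlier "Pre-event" section lands on the section-selection page rather than on the section itself.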
We analyzed response rates for both traditional online and incremental approach evaluations using PROC UNIVARIATE and PROC NPAR1WAY procedures of SAS/STAT software, Version 9.4 for Windows (SAS Institute, Cary, NC).
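For readers without SAS, the two groups of response rates can be compared with a Wilcoxon rank-sum test, which is what PROC NPAR1WAY computes for two independent samples. The sketch below uses the large-sample normal approximation without SAS's continuity correction and assumes no tied values; the per-event response rates are not reported in this article, so any input data used with it would be illustrative.

```python
from statistics import NormalDist

def rank_sum_test(sample_a, sample_b):
    """Two-sided Wilcoxon rank-sum (Mann-Whitney) test via the normal
    approximation, assuming no ties and no continuity correction."""
    n_a, n_b = len(sample_a), len(sample_b)
    pooled = sorted([(v, "a") for v in sample_a] + [(v, "b") for v in sample_b])
    # Sum of the ranks (1-based) held by sample_a in the pooled ordering.
    w = sum(rank for rank, (_, g) in enumerate(pooled, start=1) if g == "a")
    mean_w = n_a * (n_a + n_b + 1) / 2
    var_w = n_a * n_b * (n_a + n_b + 1) / 12
    z = (w - mean_w) / var_w ** 0.5
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# Illustrative per-event response rates (NOT the article's actual data).
traditional = [0.14, 0.18, 0.22, 0.25, 0.28, 0.31, 0.35, 0.40]
incremental = [0.33, 0.44, 0.51, 0.58]
z, p = rank_sum_test(traditional, incremental)
```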
Results and Discussion

Table 1 provides summary data on numbers of participants, numbers of responses, and response rates for evaluations conducted with the traditional online approach versus the incremental approach. Participant numbers were slightly higher at events for which the incremental approach was implemented, but this difference does not account for the increase in response rate. Response rates averaged 25.3% with traditional online approaches and 45.9% with the incremental approach. A Wilcoxon rank-sum test indicated that response rates for evaluations conducted with the incremental approach were significantly higher than those for traditional online evaluations (Z = 2.50, p = .0126). The incremental approach increased response rates consistently across both events. Because of the limited number of evaluations, we did not conduct a separate analysis by event type.
| Method | No. of events | Average number of event participants (SD) | Average number of respondents (SD) | Average event response rate (SD) |
| --- | --- | --- | --- | --- |
| Traditional online approach | 14 | 497 (±147) | 135 (±94) | 25.3% (±11%) |
| Incremental approach | 4 | 584 (±5) | 268 (±90) | 45.9% (±15%)^a |
^a Response rates differ significantly (Z = 2.50, p = .0126).
Conclusions

Online event evaluations have become widely used due to potential cost savings and ease of development and implementation. The incremental approach described in this article consistently resulted in increased response rates for multiday event evaluations.
Some limitations to using the approach include possible lack of participant access to/use of email, limitations of the chosen online evaluation software, and lack of alignment between the main objectives of the event evaluation and results from applying the incremental approach. However, this approach illustrates the potential of using advancements in online evaluation software to increase numbers and frequencies of interactions with event participants, thereby obtaining timelier, more complete feedback.
References

Archer, T. M. (2008). Response rates to expect from web-based surveys and what to do about it. Journal of Extension, 46(3), Article 3RIB3. Available at: http://www.joe.org/joe/2008june/rb3.php
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.
Monroe, M. C., & Adams, D. C. (2012). Increasing response rates to web-based surveys. Journal of Extension, 50(6), Article 6TOT7. Available at: http://www.joe.org/joe/2012december/tt7.php
Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment & Evaluation in Higher Education, 33(3), 301–314.
Petchenik, J., & Watermolen, D. J. (2011). A cautionary note on using the Internet to survey recent hunter education graduates. Human Dimensions of Wildlife, 16(3), 216–218.
Tobin, D., Thomson, J., Radhakrishna, R., & LaBorde, L. (2012). Mixed-mode surveys: A strategy to reduce costs and enhance response rates. Journal of Extension, 50(6), Article 6TOT8. Available at: http://www.joe.org/joe/2012december/tt8.php