February 1996 // Volume 34 // Number 1 // Feature Articles // 1FEA1

New National Program Information System for Cooperative Extension: Lessons from Experience

National Extension program information systems focused on Federal needs have been plagued by severe data acquisition problems. These problems may recur in a new national program information system which is to include Extension. This new system is being developed in response to the Government Performance and Results Act. To overcome potential data acquisition problems, state and Federal program staffs must collaborate in building mutually relevant, valid, and complete databases for national Extension programs. Under existing institutional arrangements, program staffs in states will engage in such collaboration to the extent that indicator data for the national program information system are useful for state and local program planning, budgeting, management, and marketing. Nine steps are recommended to improve program effectiveness, management of program delivery, and program accountability, which are purposes of the Government Performance and Results Act.

Claude Bennett
Program Evaluation Leader
Plant and Animal Production, Protection, and Processing
Cooperative State Research, Education, and Extension Service
Washington, District of Columbia
Internet address: cbennett@reeusda.gov

The Cooperative State Research, Education, and Extension Service (CSREES) and its land-grant university partners are developing a new, national program information system that will cover programs of the Cooperative Extension System. The Government Performance and Results Act (GPRA), designed to improve both the management and the accountability of programs supported by Federal funding, provides the mandate, direction, and overall framework for the new program information system. To be successful, GPRA implementation must be consistent with Cooperative Extension's decentralized, intergovernmental partnership and programming structure.

Given GPRA's Federal origin, there is a risk that the design of the new program information system will under-emphasize the program information needs of the state and local Extension partners of CSREES. If this happens, the new system will likely repeat major problems encountered by CSREES's current (1992-1997) Program Planning and Reporting System (PPARS). PPARS was designed to focus on the Extension program information needs of CSREES without attending to the related program information needs of state and local Extension partners.

National Extension program information systems that have focused only on Federal information needs, including PPARS, have been plagued by severe data acquisition problems. Problems in acquiring valid and complete program indicator data are likely to recur in the new, GPRA-influenced information system unless designers understand and apply lessons from past and current experiences with Extension national program reporting systems. This article is intended to stimulate consideration of these experiences toward ensuring that the new, national program information system acquires adequate program indicator data.

Lessons from Experience

The Extension Management Information System (EMIS) was launched in 1970. It focused on national quantitative indicators of staff time, program activities, and clientele participation. EMIS was discontinued in 1981, due to widespread state Extension staff resistance to the reporting burden; lack of usefulness of detailed national statistics on Extension program efforts and clientele participation; and lack of data on program results.

Reacting to the limitations of EMIS, Extension's 1982-1991 Narrative Accomplishments Reporting System (NARS) de-emphasized national quantitative indicators. Instead, NARS encouraged state Extension program staffs to report their respective program plans and attendant outcomes according to state perspectives. NARS reports were intended for use at both the state and Federal levels. NARS provided many valid and helpful anecdotal reports about state Extension programs and their results; these were used in numerous U.S. Department of Agriculture reports to exemplify Cooperative Extension programs and associated results.

But Extension national program leaders found the largely narrative, state-oriented NARS program reports difficult to combine into national or multi-state generalizations about the scope and effectiveness of Extension programs. Retrievals from the NARS database were time-consuming. The state program data, self-selected by individual states, could not be aggregated; and synthesizing state reports into national reports was extremely demanding because of the lack of consistency among state reports.

Hence, PPARS was designed to aggregate--for national use--state Extension program indicator data focused on federally-selected quantitative indicators. PPARS promised to efficiently provide useful quantitative generalizations about the scope and performance of nationally targeted Extension programs (i.e., state Extension programs that fit into the same federally-defined program category). But PPARS has yielded few credible generalizations of this kind, due to the generally poor quality of indicator data in most of its databases representing nationally targeted programs. Many state Extension programs are represented by only partially complete data for only some of the national indicators that are relevant to these programs. State narrative data that are submitted to PPARS tend to be illuminating, but these submissions are often limited in depth, scope, and consistency. The quality of the databases for most of the nationally targeted programs is so low that they are unusable for preparing credible, quantitative, national program reports.

For most of the nationally targeted programs, there is a general lack of state and CSREES processes to assure the quality of PPARS indicator data; and so, quantitative data in these PPARS national databases include many discrepancies. Consequently, different states and regions show implausible variations in reported quantitative program results within these nationally targeted Extension programs. (For example, two regions in the U.S. report expending similar amounts of resources to conduct programs within the same nationally targeted program, yet also report up to a 100-fold difference in results according to the same national indicators. Likewise, two individual states report expending similar amounts of resources within the same nationally targeted program, yet report up to a 1,000-fold difference in results according to the same national indicators.) The kinds and magnitudes of voids and invalidity in much of the quantitative indicator data of PPARS were also found in the data for the 1994-1995 Extension pilot test of GPRA implementation, and were a major reason for early termination of the pilot test.
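A simple, automated screen could surface the kind of cross-state discrepancy described above before data reach a national report. The following is a hypothetical sketch, not part of PPARS: the state names, figures, and the 100-fold threshold are illustrative assumptions, and a real check would use actual indicator and resource fields from the national database.

```python
# Hypothetical quality-control screen: flag state reports whose
# results-per-resource-dollar ratio diverges implausibly (here, by more
# than 100-fold) from the median ratio among states reporting under the
# same national indicator. All names and figures are illustrative.

def flag_outliers(reports, max_ratio_spread=100.0):
    """Return states whose results/resources ratio exceeds the median
    ratio by more than max_ratio_spread, in either direction."""
    ratios = {state: results / resources
              for state, (resources, results) in reports.items()}
    ordered = sorted(ratios.values())
    median = ordered[len(ordered) // 2]
    return sorted(state for state, r in ratios.items()
                  if r > median * max_ratio_spread
                  or r < median / max_ratio_spread)

# Illustrative data: (resources expended, reported results) per state.
reports = {
    "State A": (100_000, 500),      # 0.005 results per dollar
    "State B": (110_000, 550),      # comparable ratio
    "State C": (105_000, 600_000),  # roughly 1,000-fold higher ratio
}
print(flag_outliers(reports))  # -> ['State C']
```

A flagged report would not be rejected automatically; as the article argues, it would prompt a request back to the state coordinator to verify or correct the submission.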

Diverse factors are collectively responsible for the frequently low quality of indicator data in PPARS. However, many of the factors can be traced to limited collaboration among state and CSREES counterparts of the same nationally targeted Extension programs, regarding indicator data for those national program categories. Such collaboration requires continuity of dialogue to agree on intent and implementation of (a) collection, quality control, and analysis of indicator data, as well as (b) data utilization for improving program effectiveness and accountability. The needed collaboration would motivate both state and CSREES program staffs to ensure that program planning and performance data are of adequate quality for use at both the state and Federal levels.

Challenges for the Partnership

GPRA requires CSREES, as a Federal Agency, to submit a strategic plan, annual program plans, and annual program performance reports. Annual plans should contain goals for each program set forth in the annual budget of CSREES. Extension programs set forth as specific budget lines in the CSREES annual budget include "Children and Youth at Risk," "Water Quality," "Integrated Pest Management," and "Rural Health and Safety Education." (For purposes of facilitating compliance with its requirements, GPRA permits related budget line programs to be combined into program clusters).

GPRA requires that each federally budgeted program or program cluster have goals for immediate outputs and longer-range outcomes. Such goals are required to have quantitative indicators to gauge the performance of the program in achieving its targeted outputs and outcomes. Under GPRA, program performance indicator data used by CSREES will be subject to validity checks by external auditors representing Federal monitoring agencies.

Program performance indicators in CSREES's new information system may focus largely or exclusively on Federal needs, with little or no assurance that these indicators will be used in state Extension programming and marketing. If this happens, then Extension will again have--as with EMIS and PPARS--a national system for reporting program plans and associated results that has little or no direct potential use in states. In that event, state program staffs will understandably not see whether, or how, collection and use of data for national indicators directly reflects, guides, or benefits their respective state programs. Thus, these staffs will have limited reason to meet the challenge of supplying complete and valid data for national program indicators.

Extension staffs in states generally focus on program indicators that are relevant to program planning, budgeting, management, evaluation, and marketing at the state, area, and/or county levels. Extension's cooperative (non-line), intergovernmental character and routine funding mechanisms limit the institutional incentives provided to state program staff for reporting quality-controlled performance data to CSREES. State Extension staff have increasingly heavy workloads and a concomitant need to set priorities. Therefore, is it realistic to expect state staffs to respond with the significant effort and leadership necessary to annually provide valid and complete program indicator data that are to be used only or primarily for Federal purposes? Experience generally does not support a positive answer to this question.

Experience of CSREES staff with PPARS' federally focused program indicators shows there is, currently, only one condition that assures state Extension program coordinators will supply valid and complete data for national program indicators. The condition is this: state program coordinators receive, from their respective CSREES counterparts, (a) repeated requests (based on quality checks) for immediate, systematic upgrading of the quality of submitted indicator data, along with (b) considerable amounts of sustained support, guidance, and encouragement toward resubmitting the requested data at upgraded quality. Significant CSREES staff time commitments are needed to sustain these requests, support, guidance, and encouragement. Such efforts are being expended by the CSREES staff of a few nationally targeted Extension programs, but the magnitude of these efforts appears to be unsustainable by CSREES program staff as a whole.

Program indicators (or sub-indicators for constructing indexes or scales) must be relevant and useful at the state, area, and county levels. Under these conditions, CSREES, state, and local cooperators can have a collaborative, jointly-owned program information system that produces valid and complete program performance indicator data.

The key to establishing a meaningful national GPRA system for Extension is to construct it to be useful also to program and administrative staffs in states--for program planning, budgeting and managing, as well as program improvement and marketing. This will: (a) avoid state, area, and county staff perceptions of the new, national program information system as just a burden and exercise required by the Federal partner; and (b) promote collaborative, effective efforts by state and CSREES program staffs to acquire and use valid and complete program indicator data.

State program staff motivation to obtain and report valid and complete program indicator data must be the number one priority in constructing a workable, national program information system. Given existing institutional arrangements within and between county, state, and Federal partners of Extension, this motivation can only be achieved through joint state/CSREES program staff ownership of national databases. Resources needed to obtain, analyze, communicate, and use relevant, valid and complete performance indicator data will follow from program staff motivation to market and improve their respective programs. Increasingly competitive funding environments reflected by downsizing and privatization of public sector programs will augment this motivation.

The following are nine recommended steps for collaboration among CSREES, state, and local program staffs to successfully complete GPRA annual plans and performance reports. These steps would obtain and use management and accountability indicator data regarding the targets, implementation, and associated results of each CSREES budget line program or program cluster. The steps (a) are based upon the observations, lessons, and precepts cited above, and (b) draw upon experience of and innovations by CSREES and collaborating state staffs in Extension's national water quality initiative. Of course, pilot testing would be necessary prior to any agency-wide adoption of these recommendations.

The first six steps for a budget line program or program cluster lead to submission of a proposal by a national committee of one or more CSREES program leaders and representatives of their state and local counterparts. The proposal would be to obtain and use program indicator data. The final three steps focus on state and CSREES approval and implementation of the proposal, followed by evaluation of both the resultant program information and the methodologies used to acquire and use it.

  1. Negotiate definition of a program or program cluster (relative to one or more CSREES budget lines) that would best assist program planning, management, budgeting, evaluation, and accountability at the state level and at the CSREES level.

  2. Delineate program goals, strategies, activities, and resources, as well as performance indicators and sub-indicators. These indicators and sub-indicators would be selected by state and CSREES counterpart program staffs as useful to their respective program planning, budgeting, management, and evaluation for program marketing and improvement. Commonalities or convergence of programs and associated information needs in individual states would be a basis for selecting national program indicators and sub-indicators. (These would be numerous enough to avoid combinations of "oranges and coal" that are meaningless for program management, accountability, and/or marketing).

  3. Recommend that the proposed program indicators and sub-indicators--and the resources needed to collect, assure quality, and analyze the indicator data as well as use the resultant program information--be approved by Extension administrators at the state and CSREES levels.

  4. Suggest specific uses of the anticipated program reports that are to be based on analyses of (a) the national program database, and (b) related state program databases. Such specific uses would include ways to improve program effectiveness, management, accountability, and marketing. State and CSREES program staffs would complete, distribute, and use program reports for program marketing to state and national decision makers.

  5. Recommend, for both the national and state levels, the organizational and technical procedures, software and hardware, training, and support budgets needed to ensure that CSREES and state program staffs successfully obtain, quality check, transmit, analyze, and use the agreed upon program indicator data.

  6. Recommend CSREES partnership with a selected state institution to: (a) help assure adequate quality of state Extension program indicator data in the national program database, and (b) perform national database analysis and report preparation, in consultation with a national panel representing state and county staffs having vested interests in reflecting the nature of their programming and its degree of effectiveness.

  7. Gain administrative approval of the overall proposal to acquire national and state program information (Steps 1-6), including commitment of requested resources. Then, implement the approved program information proposal and distribute the ensuing state and national reports according to strategy.

  8. Utilize the above reports to improve Extension program effectiveness and accountability (as intended by GPRA) through multiple communication channels and deliberations at county, state, and national levels.

  9. Share and utilize suggestions by state and CSREES staffs for improvements in program information and the methodologies for its acquisition and use.

CSREES staff--through cooperative agreements with one or more state institutions and guidance from the overall national GPRA Council for CSREES--would summarize the array of completed national program reports into brief summaries of program plans and performance reports for submission to USDA's Chief Financial Officer, as required by GPRA.

Conclusions and Implications

Collaboration among CSREES, state, and local staffs can successfully produce and utilize state and national reports of Cooperative Extension program plans and associated results. It is critical that state, local, and CSREES program staffs have joint ownership of the new, national program information system.

In the United States, there is a strong consensus that providing education is primarily a state and local responsibility. Therefore, performance indicators for Extension budget line programs of CSREES should summarize those indicators that state and county staffs of the respective programs select and use most frequently to improve their program management, delivery, and accountability. This view places responsibility on each state Extension Service to ensure that its program staff uses robust program indicators for their own programming and reporting purposes. CSREES is collaborating with selected state Extension services to examine needs and potential ways for state and local Extension staffs to strengthen their use of program indicators in water quality programming and marketing.

This article is intended to provide food for thought toward obtaining adequate indicator data that can be used to improve operational program effectiveness, delivery, and accountability as intended by GPRA. It is imperative that the Cooperative Extension System maximize the usefulness of indicator data regarding its programs.