Minimizing Evaluation Costs

There are no two ways about it: Well-designed evaluations cost money. Just how much money depends on the experience and education of your evaluator, the type of evaluation required, and the geographic location of your program. A good rule of thumb is to budget 15% to 20% of available funds for evaluation. Are there ways to cut corners and evaluate for less? Yes, but, unfortunately, low-budget assessments tend to produce results with little validity. How, then, can you save money without compromising the worth of your findings?

Working with an evaluator:

  • Look for a qualified yet inexpensive evaluator. Faculty at colleges and universities are a good source. They might make the evaluation a project for one of their classes, or they may have access to students who can act as paid research assistants at lower rates than other evaluation staff. Be sure, however, that the faculty member will closely supervise any students who participate in the study.
  • Look for an evaluator who may be able to get independent funding to conduct the evaluation through a separate grant or contract. This is rare but not unheard of. The chief drawback is that you may have to wait—sometimes several months—to see if this can be done before proceeding with the study.
  • Explore other incentives. For example, look for an evaluator who is interested in branching out and trying new things. Sometimes an evaluator will work for less in order to have an opportunity to do research on a new topic.
  • Look for an evaluator who has experience evaluating programs like yours. Again, this will save money because the evaluator is already familiar with instruments, design issues, and other aspects of the study.
  • Use a collaborative model. Having program staff assume some evaluation tasks (like survey monitoring or data entry) reduces costs. If your organization has the capacity, also consider using an in-house evaluator.
  • Ask the evaluator to price the components of the evaluation separately. An itemized estimate makes it easier for you to drop particular elements or make informed decisions about how to divide up the work.
  • Estimate cost before specifying an amount in your funding proposal. If you pick a number more or less out of the air (say, $30,000), you are going to get proposals for studies that cost close to that amount, even though you might be able to get the study done for substantially less. Your future funder may be willing to help you estimate appropriate costs: ask them to consider what they want from the evaluation and provide a ballpark estimate of what it should cost. You can also hire an evaluator just to do a cost estimate.

Spending less on data collection:

  • Start small. Narrowing the scope of your evaluation saves money without compromising the quality of your findings. One way to do this is to limit your evaluation to specific target audiences. For example, if your program aims to affect both students and parents, you might study its impact on only one group, then study the other group when you have more resources. Another way to save money is to focus on intermediate outcomes, since long-term outcomes are usually more difficult to assess. If you are involved in a drug-education program, for example, start by finding out whether participants learn new information about drugs. Then, if you get positive results, you can look at long-term outcomes when more funds become available.
  • Use existing data. Although data specialists often have concerns about the quality of even the most reputable national data sources, these data are usually better than what a community can collect without a significant expenditure of time, money, and expertise, as long as the data are transferable to your own project. Most national data systems analyze their data and publish reports (often available at no cost online). Others will supply raw data to users, do custom data runs, and/or make their data available in formats (or on the Web) that allow people to generate reports tailored to their specifications.
  • Pilot test. Save money in the long run by pilot testing: trying out your data collection instruments and procedures with a small group from your target audience. Pilot tests can reveal critical flaws, such as ambiguous questions or interviews that run far too long. Data collection efforts (especially those involving large numbers of people) should always be pilot tested before full implementation.
  • Implement with care. When collecting data, consistency is key. Communicate the importance of consistency to everyone involved in the effort so that your results are reliable. If they aren't, you may need to start the evaluation process all over again.
  • Select data collection methods wisely. Consider the advantages and disadvantages of focus groups, surveys, and interviews. Qualitative methods such as focus groups and interviews tend to cost less than administering a survey (depending on the population you are surveying) and can still produce useful information.

Keep in mind that investing in evaluation can actually save you time and money over the long haul. With the information you learn from a worthwhile evaluation, you can focus your resources on the most critical problems facing your community and the most effective countermeasures. However, you are much more likely to collect this information if you partner with a knowledgeable evaluator who understands your program and with whom you can work comfortably.

Last Updated: 09/24/2015