The value of evaluation: a grantmaker’s perspective

By Jeff Pryor, Ed.D., Anschutz Family Foundation Executive Director
Alexandra Mitchell, M.P.A.
Whitney Johnson, M.En

As grantmakers grappled with the development of the Colorado Common Grant Report, there was considerable discussion about the purpose of the report. Would it simply be a summary of the nonprofit’s work in relation to the grant? Or should it include discussion of the organization’s efforts in evaluation and its endeavor to become what Peter Senge calls a “learning organization?” In the end, it was decided to have the Common Grant Report be just that—a report.

So why evaluate, why report? Of course, there are accountability justifications, and people generally recognize the value of transparency (unless you’re Bernie Madoff). Yet observation suggests that evaluation is often seen as cumbersome; a requirement of someone else, someplace else, and extraneous to the real work of the organization. The bet is that most nonprofits are so dedicated to mission and to results that they are purposeful and confident in the evaluative process. Even so, evaluation can elicit the same panic response as speaking in public or jumping off a high bridge into deep water. A typical grantmaker fear is to hear, “Sorry, what do you want us to measure?”

Can a nonprofit gain competence and confidence in the process of evaluation to the point that the organization clearly defines priorities and actions? Not having confidence about evaluation is a capitulation—it puts someone else in charge. Now imagine an organization that has devoted a reasonable amount of energy to the evaluative process. It should have asked many questions, defined parameters, sought third-party verification, checked out the work in the field, and conducted a literature review. It sounds daunting, until one realizes that it’s the same process most people use to pass a driver’s license test. We evaluate all the time—a movie, a restaurant, a singer on American Idol—and we often use both subjective and objective measures, both fact and intuition.

Can the evaluation process be implemented so that it is both integrated and ingrained within the organizational culture? We know that staff and stakeholders will use both the process and the findings when they understand and are empowered by evaluation. But it’s important to keep in mind the old adage, “It’s the journey, not the destination.” The idea is to strengthen projects, to understand why a project is, or is not, meeting goals, and to take a flexible, multidisciplinary approach that is inclusive of all voices and positions within and outside the organization. Evaluation provides a collective understanding and can be an incredibly powerful tool for identifying strengths, weaknesses and opportunities at every step of the life cycle of a program or project.

The bumper sticker “Don’t always believe what you think” captures one of the major challenges in the field of evaluation. Our opinions, available information, biases and approaches to the collection of information (a.k.a. “data”) can all give us just enough information to draw conclusions—conclusions that might be correct, or might be totally off base! Understandably, then, many have been wary of unsubstantiated research methods and of efforts designed to “prove” something, both of which are easily corrupted. Some say that being detached, objective and measured is the only way “true” research can be conducted; but then it seems too complicated, so why bother? The trouble is that people are often invested in and seriously care about the results.

“Hard science” folks can be suspicious of the “soft science” genre, yet all fields are beginning to recognize that good research and evaluation, no matter the discipline or field of endeavor, is based on the best combination of secondary and primary sources of information, qualitative and quantitative approaches, proper methodology, objectivity and integrity. Inquiry, testing, analysis and results thus serve to guide us:
➢    “We have both objective and subjective information that leads us to believe that the course we have chosen is the right one.”
➢    “We will use an approach that was created as a result of our own analysis and what the evidence suggests.”
➢    “We will pay attention to a number of indicators and, to the best of our ability, use the information we collect in the most appropriate way possible.”
➢    “We will monitor our progress towards a goal and allow ourselves to adjust our directions and efforts as we learn throughout the process.”

True, evaluation can be costly and time consuming, but reasonable approaches exist, along with a set of procedures to improve the validity of our research efforts. In essence, if a set of procedures is followed, the findings should be less contaminated by subjective influence and/or faulty research design. Furthermore, solid evaluative procedures encourage managers and leaders to base their decisions on this information gathering and analysis. The purpose is to define what is possible and to use good information gathering—just as a farmer does when trying to increase the odds of a good crop yield by observing weather patterns, time, moisture and intensity of light—to make something happen and produce positive results.

Never should the words be spoken by a nonprofit representative, “I’m sorry, say again, what exactly did you want us to measure?” It’s far better to hear, “Let me describe for you why we are a ‘learning organization,’ the steps we take to be deliberate about the evaluative process, and how that relates to meeting our mission. We incorporate studies within our field, we pay close attention to indicators, we’ve developed strategic relationships (e.g., with a local college or evaluation consultant), we share information, engage in collaboration to strengthen our efforts, and we invite peer review. We also have a board committee that reviews our procedures and results, and actively engages our staff and stakeholders. Additionally, we are devoted to both short-term and long-term planning, and we incorporate an inclusive evaluative process that involves all aspects of the organization—our staff, volunteers, stakeholders, facilities, finances, cultural competency, management processes and so on. Now, do you have any other questions?”

And your answer might be, “How can we help you?”
