Evaluation To Support Strategic Learning


Evaluation to Support Strategic Learning: Principles and Practices

Julia Coffman and Tanya Beer

June 2011

©2011 Center for Evaluation Innovation. All rights reserved. May not be reproduced whole or in part without written permission from Center for Evaluation Innovation.

About the Center for Evaluation Innovation
The Center for Evaluation Innovation is a nonprofit effort that is pushing evaluation practice in new directions and into new arenas. Founded in 2009 and based in Washington, D.C., the Center specializes in areas that are hard to measure and where fresh thinking and new approaches to evaluation are required. This includes, for example, the evaluation of advocacy and policy change, communications, and systems change efforts. The Center partners with others to develop and share new ideas and solutions to evaluation challenges through research, communications, training development, and convening.
About Evaluation for Strategic Learning
Evaluation that supports strategic learning is an area that the Center for Evaluation Innovation is helping to develop and grow. Working with organizations and groups to integrate evaluative thinking into their strategic decision making and bring timely data to the table for reflection and use has tremendous potential as an approach to evaluation, particularly for complex and dynamic social change strategies.
This brief defines the concept of evaluation for strategic learning and the principles that underlie it. Its aim is to increase awareness about, and use of, this approach. The Center is supporting the development of further resources and tools that evaluators, nonprofits, and funders can use to learn about and apply strategic learning.

Julia Coffman and Tanya Beer

Strategic learning is the use of data and insights from a variety of information-gathering approaches—including evaluation—to inform decision making about strategy. Strategic learning occurs when organizations or groups integrate data and evaluative thinking into their work, and then adapt their strategies in response to what they learn. Strategic learning makes intelligence gathering and evaluation a part of a strategy’s development and implementation—embedding them so that they influence the process.
Foundations increasingly are adopting the concept of strategic learning to describe their efforts to incorporate reliable data and ongoing reflection into their social change strategies. This concept is particularly important for those strategic philanthropists who recognize that difficult problems often require transformative solutions, and who, in response, are adopting dynamic approaches to social change that think big and aim high. Their strategies are often complicated—i.e., with multiple causal paths or ways of achieving outcomes—and complex—i.e., emergent, with specific goals and activities that develop while the strategy is being implemented.1
Foundations that take on these strategies must be willing to live with uncertainty and acknowledge that their plans, no matter how well laid out, will likely shift as the circumstances around them evolve. Strategic learning promises that the lessons that emerge from evaluation and other data sources will be timely, actionable, and forward-looking, and that strategists will gain insights that will help them to make their next move in a way that increases their likelihood of success.
This brief digs deeper into the concept of strategic learning and explores the characteristics of how evaluation, in particular, can support it. It proposes a set of principles that represent the “non-negotiables” of evaluation for strategic learning. The brief concludes with thoughts on what lies ahead for this emerging field of practice.

1 Rogers, P. (2005). Evaluating complicated—and complex—programs using theory of change. The Evaluation Exchange, 11, 13.




What Does Strategic Learning Require?
As defined above, strategic learning integrates information from a wide variety of intelligence-gathering activities, including evaluation, into decisions about strategy. This seems straightforward enough, yet foundations have puzzled over how to put strategic learning into practice. Perhaps this is because once this deceptively simple definition is unpacked, three significant questions emerge:

1. What do we mean by “strategic” decisions? More specifically, what counts as strategy in the philanthropic and nonprofit sectors, and how does strategy get created, executed, and adapted?
2. What kinds of data and intelligence are meaningful for informing strategic decisions?
3. What does it take to “integrate” evaluation into strategic decisions? That is, how do organizational and group culture, structures, and processes support or hinder reflection and the use of data?

Putting strategic learning into practice requires attention to all three of these questions. Approaches that attend to only one or two of them can fall short of producing observable changes in strategy. For example, a sole focus on the third question—facilitating organizational structures that support learning (better grantee reporting, cross-departmental information sharing, etc.)—without discerning and collecting the kinds of data that are relevant for strategy decisions (the second question) can result in an explosion of information that has little meaning or relevance. Conversely, carefully defining and collecting relevant data without nurturing the organizational culture and processes that allow groups to reflect and make meaning of the data can result in the much-lamented “lessons learned” report that never leads to organizational behavior or strategy changes. And finally, even attention to questions two and three above without clarity about what exactly is meant by “strategy” and how an organization or group adapts its strategy over time can lead to ad hoc and unfocused changes in direction that water down potential impact.

The implication here is that strategic learning is not just about evaluation and how evaluation is positioned. Although evaluators have a key role to play, so do programmatic staff and leaders who determine when and how strategy is planned, executed, and adjusted. Evaluators may answer the questions, but it is these staff and leaders who ask them—framing the questions of what they need to know to be more effective. Also, although the data and lessons produced by evaluation are indispensable to strategic learning, so too are data and feedback gathered through other evaluative activities, such as situation analyses, needs assessments, and systematic reflection and insights from peers, partners, or personal experiences. Even while the concept of strategic learning focuses on a very specific purpose—informing strategy decisions—it invites a broadening of ideas about the role that evaluation and other information sources can play. For this brief, however, the focus is on evaluation’s role in this equation.




How Is Evaluation for Strategic Learning Different?

Evaluation is conducted for a wide variety of purposes or uses. These intended uses help to determine which evaluation approach is appropriate. For example, foundations often commission retrospective evaluations to judge an effort’s overall merit or worth so they can decide whether to continue or discontinue it, or scale it up in other sites.2 Other evaluation approaches are used to improve a program and ultimately distill it into a “recipe” or model that can be tested later for impact. Still others aim for accountability, or ensuring that efforts are doing what they said they would do and that resources are being managed well.

Evaluation that supports strategic learning has a different purpose. Designing data collection and evaluation specifically to support strategy decisions requires shifts in thinking about what questions get asked, the role the evaluator plays, how data collection is timed, and the framing of the findings. While decisions about program continuation or replication can be strategic, they are only so in the context of larger prospective strategic questions, such as: If this approach did not have the impact we hoped, does it mean that our strategic positioning (what we are trying to affect) or our strategic perspective (how we are trying to affect it) should be changed, or are there tactical missteps that need to be fixed?3 These kinds of strategic questions can rely in part on the information garnered from an evaluation that judged an intervention’s worth or merit, but they also need to be answered with additional intelligence-gathering that goes beyond the effects of a particular intervention.

Likewise, while evaluation for strategic learning certainly aims to help strategies improve or move in a positive direction, in reality the “right” direction is not always known. Evaluation for strategic learning does not necessarily make judgments that what was done before was ineffective. Strategic learning helps strategies adapt based on information that is known or that can be collected at the time. A program can be refined so that it works better and its essential ingredients are understood, but that does not necessarily help strategic decision makers determine whether it remains the right program for them to support, whether they are well positioned to grow the program, whether they can anticipate that the environment will remain favorable for the program given other changes, or whether the program fits with their larger strategy. Evaluation for strategic learning pays attention to those broader strategy questions.

2 Scriven, M. (1991). Evaluation thesaurus. 4th edition. Newbury Park, CA: Sage.
3 See the work of management scholar Henry Mintzberg, who identifies position and perspective as two primary dimensions of strategic change, e.g., Mintzberg, H. (2007). Tracking strategies. New York: Oxford University Press.




New kinds of evaluation have emerged to support strategic learning in recent years, including developmental evaluation (coined by Michael Quinn Patton), which features longer-term partnering engagements with external evaluators. Strategic learning can also be supported by shorter-term efforts that feature data collection at any stage of strategy development, implementation, or review. What is most important—regardless of how long an evaluator is engaged to support strategic learning, and regardless of whether it is an internal or external evaluator who plays this role—is that the purpose of supporting strategic learning remains front and center in the evaluation’s design.

What are the Principles of Evaluation for Strategic Learning?
Strategic Learning Principles
1. Evaluation is a support for strategy.
2. Evaluation is integrated and conducted in partnership.
3. Evaluation emphasizes context.
4. Evaluation is client-focused.
5. Evaluation places a high value on use, and helps to support it.
6. Evaluation data to inform strategy can come from a wide variety of sources and methods.
7. Evaluation must take place within a culture that encourages risk taking, learning, and adaptation.
8. Evaluation is flexible and timely, and ready for the unexpected.
9. Evaluation is constructivist.
So far, the general concepts of strategic learning and evaluation’s role in supporting it have been defined. To get more specific about what this approach to evaluation involves, the principles that underlie and characterize it must be explored. While there is no formula for how evaluation for strategic learning must look in practice, these basic principles further define what it entails.
Evaluation is a support for strategy. First and foremost, evaluation must be seen and positioned as a key support for strategy development and management; it should have a seat at the strategy table. Traditionally, evaluation is not viewed in this way. It is considered a separate component, usually entering after a strategy already has been developed or implemented. An emphasis on strategic learning fundamentally changes evaluation’s role and positioning.
Evaluation is integrated and conducted in partnership. Experience shows that even when evaluation is designed to support learning, the evaluator’s traditional role as an outside observer undermines the evaluator’s ability to sense strategic questions that data collection or facilitated reflection might answer. With a strategic learning approach, evaluators are embedded and use a collaborative and participatory evaluation process. This approach is different from traditional evaluation approaches in which the evaluator remains deliberately separate. In the words of Michael Quinn Patton:
Evaluators become part of a team whose members collaborate to conceptualize, design and test new approaches in a long-term, ongoing process of continuous improvement, adaptation, and intentional change. The evaluator’s primary function in the team is to elucidate team discussions with evaluative questions, data and logic, and to facilitate data-based assessments and decision making in the unfolding and developmental processes of innovation.4

Whether conducting a developmental evaluation or another kind of evaluation or evaluative activity in support of strategic learning, this learning partner or critical friend role helps evaluators stay on top of potential strategy shifts and allows them to facilitate reflection and feedback. This type of relationship positions evaluators to be maximally informed and useful. It also has implications for organizational and team structure and for the way external evaluation contracts are written.

Evaluators who subscribe to a strategic learning approach also typically take on the role of evaluation capacity builders. This means institutionalizing evaluation processes to ensure that the use of learning and reflection to inform strategy is a practice that can be sustained over time, even if the evaluator eventually exits the effort.

Evaluation emphasizes context. Many evaluations merely acknowledge that context or contextual variables exist (consider the big empty box labeled “context” that sits at the bottom of many logic models). For example, summative evaluations—or retrospective assessments of a program’s merit or worth—seek to “control for context,” treating context as a set of mitigating factors from which the true impact of an intervention must be isolated. But evaluation for strategic learning pays real attention to context, factoring it into and even focusing on it during data collection, analysis, and interpretation. It assumes that strategic decisions are never made outside of a context, so “controlling for context” only occasionally returns useful information for strategy. This means that evaluative inquiry for strategic learning isn’t only answering questions about the client’s actions. It may be answering questions about the shifting environment, other drivers in the system, and leverage points beyond the reach of the organizations at the table.

Evaluation is client-focused. Although self-explanatory, this principle bears repeating. Not all evaluation is focused on client needs. When evaluation is synonymous with applied research or has “informing the field” as its primary goal, it often is not client-focused. In addition, funders and nonprofits have different strategic questions, so evaluation that supports the strategic learning for one set of users might not meet the needs of another. It is critical to establish clarity and transparency about exactly whose strategic questions will be answered.

Evaluation places a high value on use, and helps to support it. While the quality of data collection and design is crucial, the ultimate criterion for the success of a strategic learning evaluation approach is the extent to which the client uses the information generated through data collection and reflection to answer strategy-related questions. Evaluation for strategic learning is necessarily utilization-focused, and is therefore always designed with unfailing commitment to actionable data.

4 Patton, M. Q. (2006). Evaluation for the way we work. The Nonprofit Quarterly, 28-33.
However, evaluators supporting strategic learning cannot assume that learning necessarily happens simply because data are available, even when data are actionable and timely. To help support use, evaluators must frame data in a way that clarifies the data’s connection to strategy and tactics, surfacing the implications of the data for next steps. Evaluators must also build in intentional learning and reflection processes so that the data’s end-users—the strategic decision-makers—become sense-makers, too, interpreting the meaning of data and uncovering the implications together.
Evaluation data to inform strategy can come from a wide variety of sources and methods. It may include “inward-facing” evaluative inquiry (e.g., mining existing experience for insights and challenging the organization’s mental models) and “outward-facing” evaluative inquiry (e.g., understanding the external environment, the field of players, and the key drivers in the system, and evaluating the target audiences’ or systems’ response to social change efforts). While some resist using the name “evaluation” for these kinds of data collection and interpretation approaches, particularly if they are not conducted as part of a predetermined study design, they are all evaluative exercises.
Evaluation must take place within a culture that encourages risk taking, learning, and adaptation. Research on high-impact or high-performance organizations identifies their ability to learn and adapt as a key component of their success.5 Sometimes called adaptive capacity, this means that organizations experiment and innovate, deliberately evaluate and learn from their successes and mistakes, and adapt their strategies accordingly. “For an organization to be more than the sum of its programs, it needs the ability to ask, listen, reflect, and adapt.”6 Because strategic learning is about continuous learning and adaptation, this approach to evaluation is a waste of time for organizations or efforts that lack adaptive capacity. Some organizations and groups simply are not set up to learn; they do not reward risk taking and consistently punish failure. Experience shows that a learning culture must exist for strategic learning to work.
Evaluation is flexible and timely, and ready for the unexpected. With a strategic learning approach, timing is everything. To ensure that evaluation is used, evaluators who aim to support real-time learning and decision making must deliver the right data at the right time. They must be flexible—willing to adjust their data collection plans according to how the environment or strategy is shifting; and timely—providing data and facilitating learning at the right time to inform strategic decisions. Traditional evaluation designs and practice are rarely flexible or timed for ongoing decision making, as evaluators are contracted for a predesigned plan with reports scheduled at regular intervals, regardless of what kinds of data might be most useful at different moments.

5 See, for example, Crutchfield, L., & Grant, H.M. (2008). Forces for good: The six practices of high-impact nonprofits. San Francisco, CA: Jossey-Bass.
6 Letts, C.W., Grossman, A., & Ryan, W.P. (1999). High performance nonprofit organizations. Hoboken, NJ: Wiley. p. 21.




Evaluators who support strategic learning often must have the capacity to work rapidly—quick to design, implement, and analyze data; and responsively—able to provide useful and trustworthy strategy-level data as needs arise. Not all of the data and intelligence needs of an evolving strategy can be anticipated at the outset of a social change effort; many such needs emerge at a moment’s notice. Fast-moving policy environments, for example, often present unexpected windows of opportunity or quick and unexpected changes in momentum. The time available to identify the need for new data, develop a tool, collect the data, and facilitate its use by the strategy team in time to make a difference may be very short.

Evaluation is constructivist. Constructivism is a view of the world and of knowledge that assumes that “truth” is not an objectively knowable thing but is individually and collectively constructed by the people who experience it. Constructivist evaluators assume that the practice of making meaning out of data, perceptions, and experience depends on and is always filtered through the knowers’ own experiences and understanding. Because its purpose is to inform strategy that is situated in and must respond to a real-world environment, evaluation for strategic learning must be constructivist. There are no hard and fast “truths” based on data; rather, evaluation for strategic learning generates knowledge and meaning from a combination of evaluation data, past experiences, and other forms of information. Sense-making, or facilitating the interpretation of data and its implications for the strategic choices of the user, is as crucial a skill as the collection and presentation of reliable data.


Foundations like the David and Lucile Packard Foundation, The California Endowment, The Atlantic Philanthropies, and others are now using more of their evaluation resources to support strategic learning, especially as they take on long-term strategies that have big and bold goals. (The Center for Evaluation Innovation is developing a bank of case studies to illustrate the use of evaluation for strategic learning.) As funders adopt more complex strategies to achieve big results, strategic learning’s profile will continue to rise. Momentum will grow simply because strategic learning works well with strategies that adapt over time. Its profile also will rise because while more traditional evaluations will most certainly continue to serve a very important purpose in the social sector, they will not satisfy the sector’s desire for information and data that inform strategy on an ongoing basis.

This approach is not the right choice for all, however. Reasons against using evaluation for strategic learning might be quite practical. It may have time and dollar costs that are beyond what an organization can manage. Further, not all organizations are properly equipped to learn and adapt strategies in response to data and feedback. In that case, strategic learning will surely fail. Reasons not to use it might also be philosophical. If an organization strongly adheres to traditional evaluation approaches and principles because it prioritizes evaluation for the purpose of accountability or believes that it has the most to gain from understanding the impact of its program in retrospect, an evaluation for the purpose of strategic learning will not be the right choice. But an evaluation in support of strategic learning is the right choice when practical and philosophical barriers are not in place and when evaluations are needed for strategies “with a high degree of uncertainty and unpredictability” and “where an innovator or funder wants to ‘put things in motion and see what happens.’”7
This paper attempts to catalogue the emerging principles and practices of a strategic learning approach to evaluation. What matters most is that those funding and using this approach find it useful. Looking ahead, the successes and failures of social change efforts will continue to pose questions for which strategic learning may hold answers. As such, strategic learning’s practitioners and champions must continue to seize the opportunity to revise approaches, address criticisms, adjust expectations, and identify additional promising practices.

7 Patton (2008), p. 377.



