A Framework for Decision Making in Social Innovation Labs

Having just wrapped up the Graduate Diploma in Social Innovation at the University of Waterloo, I want to take the opportunity to share some of what I learned and some of the ideas generated through the program. To start, this post introduces a framework for decision making in Social Innovation Lab environments. These Labs are more fully described by Frances Westley here, but in short they try to integrate “group psychology and group dynamics; complex adaptive systems theory; design thinking; computer modeling and visualization tools.” Westley notes that SILs should include a cross-scale focus looking at landscape, regime and innovation niches, provide a whole-system focus, make full use of research and integrate the best techniques from change and design labs. By getting all of the parts of the system together, new approaches and innovations for systemic change can result. My interest is in how technology, and data visualization and simulation in particular, can facilitate this process.

Today I will talk about the overall framework, then in future posts dive into the details and talk through some examples. In the meantime, I have attached a link to a paper I wrote on this topic here.

In the process of a Social Innovation Lab, there are a number of steps a group needs to move through in order to come to solutions that are not only innovative, but are acceptable to the group and feasible to implement. I have defined these as stages of exploration below. I have also mapped them to our project team’s work-in-progress idea of facilitating the creation of a National Citizens Energy Strategy (NCES). This will be built out of a set of cross-country, cross-sector engagement processes taking full consideration of economic, environmental and social concerns. The process and delivery are designed to be factual, transparent and science based, while remaining accessible and relevant to the general public, and will include Social Innovation Lab-style events in addition to a range of public engagement activities. Outcomes will be actionable by all sectors – individuals, corporations, non-profits and multiple levels of government. This will produce not just a report that needs to be approved but a set of actions for implementation.

Phase One – Gaining a Shared Understanding of Context

The first phase is gaining a shared understanding of context. Before participants in a Social Innovation Lab can start discussing potential solutions, they need to have a common understanding of the state of the system, what the current issues are and the parameters that might be changed. It is also critical that participants share an understanding of, and agree on, the problem to be solved and an early attempt at some criteria for a successful system. Through a Design Thinking lens, this is the importance of asking the right question. Moura Quayle, Director of the d studio at UBC, highlighted this as a key challenge in their Design Labs (personal communication, December 19, 2012).

In the context of the NCES, citizens from all backgrounds need a way to explore the current system, ask questions, test assumptions and understand relationships. We have many different actors connected to the energy system in Canada and these actors often have different interpretations of the current state. The NCES must provide ways (using a range of media to engage a range of stakeholders) to understand the current system. Most public dialogue processes are focused on this stage of the process.

Phase Two – Exploring Alternatives

The second stage in the process requires exploration and is very much a divergent stage. In design thinking parlance, “divergent thinking is the route, not the obstacle, to innovation” (Brown & Wyatt, 2010). A challenge at this stage is encouraging thinking and exploration that pushes the edges of the current system. “The natural tendency of most organizations is to restrict choices in favour of the most obvious and incremental” (Brown & Wyatt, 2010). A challenge in implementing this part of the process is that participants will likely arrive with preconceived ideas of the alternatives available. A key goal of the NCES is to expand the solution space – to allow participants to discover alternatives, combine options into new alternatives and start to see the relative benefits of each.

At this stage, participants must be encouraged to think about the measures of success – how will the group test the various alternatives that are generated? What might be some variables put in place to evaluate the alternatives and make a decision? What is needed is a set of tools that allow Lab participants to explore policy options and see interactions between variables and system impact (M. Tovey, personal communication, November 29, 2012). A challenge from a technology viewpoint is that while there is a range of extant tools,

“A far smaller portion of simulations leaves its interface open and clearly explains its limitations so that designers and decision makers can modify the assumptions or the inputs as part of thinking through their response [to] a problem. Even fewer make these capacities so accessible that groups can use them constructively to build and explore models together” (Westley et al., 2012).

What could such a policy explorer look like? One would need software that can evolve as the group’s understanding of the system evolves. Tools that can link the visual metaphors of complex adaptive system maps and basins of attraction hold much promise and will be addressed in later posts.
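To make this concrete, here is a minimal sketch (in Python) of what an open-interface policy model could look like: the assumptions are ordinary, editable parameters rather than values buried inside the tool, so a group can change them and compare scenarios side by side. Every parameter name and number below is invented for illustration; a real NCES explorer would rest on actual energy data and a far richer model.

```python
# A minimal sketch (not an existing tool) of a policy model whose assumptions are
# exposed as editable inputs, so a group can adjust them and compare scenarios.
# All parameter names and default values are hypothetical placeholders.
from dataclasses import dataclass

COST_PER_GWH = {"fossil": 0.06, "renewable": 0.05, "nuclear": 0.07}  # $M per GWh, illustrative

@dataclass
class EnergyScenario:
    demand_twh: float = 600.0            # assumed national electricity demand
    share_fossil: float = 0.20           # fraction of demand met by fossil fuels
    share_renewable: float = 0.65        # fraction met by renewables
    share_nuclear: float = 0.15          # fraction met by nuclear
    fossil_t_co2_per_gwh: float = 500.0  # assumed emissions intensity

    def outcomes(self) -> dict:
        """Return the outcome indicators a group might compare across scenarios."""
        gwh = self.demand_twh * 1000
        emissions_t = gwh * self.share_fossil * self.fossil_t_co2_per_gwh
        cost_m = gwh * (self.share_fossil * COST_PER_GWH["fossil"]
                        + self.share_renewable * COST_PER_GWH["renewable"]
                        + self.share_nuclear * COST_PER_GWH["nuclear"])
        return {"mt_co2": emissions_t / 1e6, "cost_billions": cost_m / 1000}

# Participants change any assumption and immediately see the consequences.
baseline = EnergyScenario()
retrofit_push = EnergyScenario(demand_twh=540.0, share_fossil=0.15, share_renewable=0.70)
print(baseline.outcomes())
print(retrofit_push.outcomes())
```

The point is not the numbers but the openness: because the assumptions sit in plain view, a Lab group can interrogate and rewrite them rather than accept a black box.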

Phase Three – Coming to a Decision on Alternatives

The third stage is where participants must come to agreement on which policy intervention, prototype or idea to move forward. In this stage of the process, participants must start to filter and converge on solutions. Tools must be provided to help participants with cost-benefit analyses. These tools will also help participants decide on the relevant criteria for evaluation, for example by incorporating social and environmental impact. Fraser describes a heuristic for determining and visualizing these criteria.

In addressing the challenges of identifying and deciding upon options, Fraser points to a heuristic that starts with looking for “indicators of vulnerability” (E. Fraser, personal communication, November 2012). Examples of this might be a growing reliance on technologies to increase yield or the shift from diverse planting to specialized agriculture. These indicators can be mapped as dimensions on a chart and assigned numeric values. For example, an indicator of agro-system resilience might measure ecosystem services like pollination or the concentration of production. An indicator of livelihood richness and diversity might leverage the Gini coefficient of income inequality. These indicators can then be plotted over time. When changes start to occur on multiple dimensions, a “convergence of stresses”, the system is at serious risk of collapse and smaller and smaller problems will have bigger and bigger impacts. In the agriculture example, interventions may be placed into the categories of technology, management, local food and regulation. Each category of intervention can then be assessed as to its likelihood of impacting the range of indicators of vulnerability. (This section draws on both Fraser, 2007 and Fraser, personal communication, November 26, 2012.) However, this is rarely an either-or choice. Often a portfolio approach, or bricolage, has the best chance of impacting the system (Gundry et al., 2011). In complex adaptive systems especially, it is unlikely that a single intervention focused on a single system variable will have system-wide impacts.
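As a rough illustration of the mechanics, the sketch below tracks a few hypothetical indicators of vulnerability over time and flags when several worsen at once. The Gini calculation is the standard formula; the indicator series, income samples and threshold are invented for the example and are not drawn from Fraser’s data.

```python
# Hypothetical sketch: track indicators of vulnerability over time and flag a
# "convergence of stresses" when several worsen at once. The Gini computation is
# standard; the data and thresholds are invented for illustration only.

def gini(incomes: list[float]) -> float:
    """Gini coefficient of income inequality (0 = perfect equality, 1 = maximal)."""
    xs = sorted(incomes)
    n = len(xs)
    cumulative = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cumulative) / (n * sum(xs)) - (n + 1) / n

income_samples = [             # hypothetical household incomes, one list per year
    [40, 45, 50, 55, 60],
    [35, 45, 50, 60, 70],
    [30, 40, 50, 65, 85],
    [25, 35, 50, 70, 100],
]

# Each indicator is scaled so that a higher value means greater vulnerability.
indicators = {
    "income_inequality_gini":       [gini(year) for year in income_samples],
    "production_concentration":     [0.40, 0.45, 0.55, 0.63],
    "loss_of_pollination_services": [0.20, 0.22, 0.30, 0.38],
}

def worsening(series: dict[str, list[float]], rise: float = 0.05) -> list[str]:
    """Indicators that worsened by more than `rise` over the most recent period."""
    return [name for name, vals in series.items() if vals[-1] - vals[-2] > rise]

stressed = worsening(indicators)
if len(stressed) >= 2:  # stresses converging on multiple dimensions at once
    print("Convergence of stresses across:", stressed)
```

Plotted over time, the same dimensions give the kind of chart Fraser describes; here the flag simply signals that more than one dimension is deteriorating at once.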

Timmer & Dixon propose an alternative taxonomy of decisions in evaluating best bets or system leverage points (V. Timmer, personal communication, December xx, 2012). In the first category are those innovations that might have the highest quantitative impact. For example, building retrofits might have the biggest impact on reducing CO2 in Canada. However, there may be serious challenges to implementing an innovation of that type. An alternative rubric is to look for areas of accessibility and readiness by asking where there are the fewest barriers to action, where the conditions are ready and what coming system shocks we can prepare for. Innovations in this space may not have as large a quantitative impact but are more easily adopted and have the potential to prepare the dominant regime for further change. A third category is that of symbolic interventions. Examples such as sharable tool libraries will not change our system of consumption overnight but can be emblematic of a bigger change, in a sense serving as prefigurative action (action that provides a model or early representation of what systemic change could look like) for system change (S. Quilley, personal communication, November 23, 2012). Finally, there are the options that one may choose in order to create, nurture or sustain alternatives. Here we are taking the approach of making the innovation basin of attraction more stable and resilient in preparation for system change.

Various tools and techniques exist for decision making; however, new tools are emerging, such as Ethelo, designed for large-scale public decision making based on principles of “fairness” rather than consensus, and tools developed by Chamberlain and Carenini at the Institute for Resources, Environment and Sustainability at the University of British Columbia for multivariate decision making in a visual interface (personal communication, October 12, 2012). These tools attempt both to use visual techniques for decision making and to ensure understandability and accessibility for decision makers.
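The contrast between consensus-style and fairness-style aggregation can be sketched in a few lines. To be clear, this is not Ethelo’s actual algorithm or the UBC tools; it only illustrates the general idea that an option with slightly lower average support but much more evenly distributed support may be the better group choice. All ratings are invented.

```python
# Illustrative sketch of fairness-oriented group choice (not Ethelo's algorithm):
# penalize options whose support is unevenly distributed across participants,
# instead of simply maximizing average support. Ratings are invented.
from statistics import mean, pstdev

# Each participant rates each alternative from 0 (oppose) to 1 (support).
ratings = {
    "carbon_price":       [1.0, 1.0, 0.1, 0.2, 1.0],
    "building_retrofits": [0.65, 0.60, 0.65, 0.60, 0.65],
    "status_quo":         [0.40, 0.30, 0.50, 0.40, 0.30],
}

def by_average(scores: list[float]) -> float:
    return mean(scores)

def by_fairness(scores: list[float], penalty: float = 1.0) -> float:
    # Uneven support lowers the score, favouring broadly acceptable options.
    return mean(scores) - penalty * pstdev(scores)

print("Highest average support:", max(ratings, key=lambda k: by_average(ratings[k])))
print("Fairness-adjusted choice:", max(ratings, key=lambda k: by_fairness(ratings[k])))
```

With these invented numbers the highest-average option and the fairness-adjusted option differ, which is exactly the tension such tools are built to surface.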

A difficulty in using these, or any, tools for deciding on which innovation to pursue is the uncertainty caused by the “dance between deliberation and emergence” (F. Westley, personal communication, November 22, 2012). The phrase indicates that a model assuming innovations will remain “pure” once released into the world is fundamentally flawed. There is an inherent tension between, on the one hand, examining how well an innovation is designed, assessing its attractiveness (ability to attract resources) and its degree of radicalness (likelihood of attracting resistance), and, on the other, the knowledge that innovations may change, be “corrupted” or adopted by the dominant regime (Smith, 2007), or be rejected by the system in the phenomenon of remembrance (a system reverting to the dominant regime state) (Westley et al., 2006). The innovation may change, but the goal is coherence of design rather than consistency, which only works for complicated as opposed to complex systems (F. Westley, personal communication, November 23, 2012). Once again, Gundry et al. raise the point that no single alternative will be sufficient (2011). Design – not just the elements but the relationships between them – is, like bricolage, bigger than the sum of its parts.

Phase Four – Implementation and Evaluation

The final stage, often after the Lab is complete, is implementation. While not technically part of the Lab process, if this stage is not considered, the result of all the hard Lab work may come to naught. Issues of feasibility (which should have been considered in the earlier design phase) and innovation translation (Smith, 2007) must be addressed. This stage is where the difficulties of having only a limited subset of the system in the room become an issue. As will be addressed in later posts, unless system actors that have direct ability to affect an issue are present, there is serious risk that the outcomes will not be actionable. Born references this approach when recommending that convenings should include representation from the private sector, the public sector, non-profits and those with lived experience of the issue being discussed (2008). Note that this is different from stakeholder engagement. Here we are talking about engaging not only those who might need to be consulted due to legislation or regulation, but also those who are actively working on system change, whether they be activists, institutional entrepreneurs, NGOs, private sector leaders, elected officials or policy makers. A related challenge in this final stage is measurement and evaluation – how will we test our assumptions and collect data that will inform future decision making?

More on that, along with more background on the four phases, next time.

Williams-A Framework for Decision Making in Social Innovation Labs
