In many instances, an SBCC project is asked to design, implement and evaluate SBCC interventions in collaboration with local country-based partners. This collaborative implementation process offers tremendous opportunities to nurture SBCC knowledge and skills among local counterparts. However, merely engaging in collaborative implementation does not automatically increase capacity.
For example, simply inviting MOH staff to an SBCC strategy design workshop does not ensure staff will gain SBCC design skills. The collaborative implementation process must include a systematic plan for how capacity will be reinforced throughout the process. To encourage capacity building through the collaborative implementation process, HC3 developed a Project-Based Learning model, which includes experience, review and application.
Project-Based Learning takes advantage of collaborative implementation opportunities to reinforce learning. This experiential, interaction-based approach allows professionals to immerse themselves in the process of gaining knowledge and applying it directly to a relevant situation in the workplace. It can be used even when capacity strengthening is not an explicit objective of the program, since activities can be implemented at low or even no cost, such as facilitating discussion learning groups on Springboard.
Within a Project-Based Learning approach, professionals gain SBCC knowledge and skills as they are given time and space to practice an activity, reflect on it, and apply their learning alongside SBCC experts who can guide the process. The Project-Based Learning model follows three core steps:
- 1. Planning and Doing
- Perform. Do the activity. Plan for discovery. Create an experience.
• Job aids
• Formal training/short courses
• Guided discovery
• Structured discussion
• Professional networks
• Books/articles
• Videos/podcasts
• Role-plays/drama activities
• Personal stories/case studies
• Visualizations and imaginative activities
• Team games/problem-solving
- For example, a project might work with the MOH to conduct a situation analysis or design a national SBCC strategy. Before performing these activities, provide materials or trainings to help the MOH staff prepare and plan for the activity.
- 2. Reviewing
- Share results, reactions, observations. Process by discussing. Look at experience, analyze, reflect.
• After action reviews
• Briefing sessions
• Discussions/reflection in cooperative groups
• Small face-to-face group work
• Email/online discussion groups
• Professional networks
• Storytelling, sharing with others
• Reflective personal essays
• Thought questions
• Personal journals, diaries
• Participant presentations
- For example, a project might organize cooperative discussion groups or Springboard forums to discuss the implementation process and what was learned.
- 3. Applying
- Generalize. Connect experience to real world. Apply learning to similar or different situation.
• Application sessions
• Models, analogies and theory construction
• Coaching/mentoring sessions
• New SBCC campaign design and implementation
- For example, a project might provide mentorship and coaching as the MOH applies their learning to a new campaign. Or, HC3 might facilitate an after-action review to discuss how to apply new skills.
This three-step model can be applied at each of the five stages of the program implementation process to encourage learning through practical experience.
An SBCC project can support and facilitate the Project-Based Learning process by providing opportunities to apply the model, developing materials that support learning, offering feedback and encouraging supervisors to create situations to apply newfound learning.
WHAT IS THE TIME HORIZON FOR CAPACITY STRENGTHENING ACTIVITIES?
The time needed for capacity strengthening activities to reach their intended goal(s) will depend on multiple variables. There is no easy formula. Variables include those conditions that are inherent in any capacity strengthening program: the base level of capacity of the intended recipient(s) of the capacity strengthening (whether an individual, organization or system); the level of capacity desired; the amount of resources available for capacity strengthening and thus the intensity of effort of the capacity strengthening; and the level of buy-in for the capacity strengthening, both on the part of the capacity strengthening recipient and also in the recipient’s environment.
In addition, external factors that cannot be predicted at the outset may influence the time horizon: staff turnover, shifts in leadership, significant changes in the workload of the capacity strengthening recipients, political disruption, or a crisis that demands that capacity strengthening and human resources be diverted for some period of time (such as the earthquake in Nepal or avian influenza in Egypt), among others. Given all these variables, a capacity strengthening program can be as short as a few hours or as long as several years.
Given that capacity strengthening is both science and art, it is critical that all stakeholders, including capacity strengthening recipients, capacity strengthening providers, donors, government and others, have open communication and share clear expectations of the goals of the capacity strengthening and what is realistically needed to get there. When obstacles or opportunities arise that may shift the time horizon, all stakeholders should be made aware so they can manage expectations and either agree on a revised timeline or reassess goals if necessary.
DEFINING AND MEASURING CAPACITY STRENGTHENING SUCCESS
Given the varied and complex nature of programs, success will look different in each case. The definition of success can highlight both the process as well as the outcomes of the capacity strengthening efforts. At the most basic level, measuring success around process can center on the achievement of specific activities and outputs, at any of the levels. An even greater measure of success is achieved when a capacity strengthening program can demonstrate the link between its capacity strengthening efforts and certain outcomes, whether intended or unintended.
The challenge lies in measuring outcomes tied to the level at which the specific activity is intervening. For example, a training activity that aims to strengthen SBCC competencies at the individual level can measure success with pre- and post-tests conducted over different points in time. When looking to assess organizational or system-level change, however, readily available tools and traditional monitoring and evaluation approaches are not able to accurately measure success. Depending on the type and scope of the activity, measuring outcomes in these instances calls for approaches such as program or organizational documentation review, expert assessment of work outputs and interviews with key stakeholders about observed changes.
Defining outcomes and determining the most appropriate ways to measure success should be a negotiated process among stakeholders, including the recipient(s) of the capacity strengthening, the capacity strengthening provider and the donor. It is critical that a common understanding of success be established at the beginning of the project. For example, if a two-year program is developing the capacity of a fledgling SBCC organization with little experience, then the level of capacity achieved will be well below what might define success for an organization starting at a much higher level of capacity.
The capacity strengthening ecosystem framework illustrates the complex, non-linear nature of SBCC capacity strengthening. There are multiple stakeholders at various levels of intervention. In addition, the context of the specific geographic location where capacity strengthening activities take place creates unique environments with distinct challenges and realities. As a result, as mentioned above, the task of measuring outcomes resulting from SBCC capacity strengthening requires different approaches than traditional monitoring and evaluation.
One novel approach for monitoring and evaluation of SBCC capacity strengthening is Outcome Harvesting (OH), a participatory method of assessing programmatic success by identifying both intended and unintended results of programs. OH is well-suited to capture project results in complex situations where the cause and effect of an intervention are unknown, or where agreement among many stakeholders must be reached in order to finalize and continually adapt an intervention's strategy. OH is ideal for considering multiple perspectives to decide who and what has changed since the start of an intervention, when and where change has occurred, and how the change came about.
- Experiential Learning Theory (see http://www.learning-theories.com/experiential-learning-kolb.html for a brief summary)
- Institutional Economic Theory
- Complexity Theory (as applied to capacity strengthening)