Knowledge Sharing for Democracy Assistance
Center for International Private Enterprise
Non-Governmental Organization (NGO)
Describe the KM initiative
The Center for International Private Enterprise (CIPE) Knowledge Management initiative shares lessons, strategies, and case studies among CIPE’s partner network and a diverse range of governance reform leaders. Established in 2005, the program amplifies CIPE’s impact and fosters organizational learning. The initiative focuses on lessons linked to CIPE’s mission of strengthening democracy around the world through private enterprise and market-oriented reform. Staff rely on knowledge management for program development and training materials, while private sector partners look to CIPE for ideas and models to apply in their own efforts.

Key objectives of the initiative are: (1) to capture lessons learned in program design and implementation in order to improve the effectiveness of future CIPE programs; and (2) to increase cross-regional sharing of lessons learned and successful approaches to building the institutions of democratic free-market systems.

Knowledge sharing at CIPE operates within the context of a partnership model. Typically, partner organizations formulate local solutions to local challenges, and CIPE provides them with management assistance, training, and financial support. In more than 25 years of experience, CIPE has found local knowledge to be a critical success factor, as well as a key source of CIPE’s accumulated body of international knowledge. In sharing knowledge, CIPE does not attempt to provide unique answers; instead, it provides analytical tools and a selection of approaches that local partners can adapt.

Key elements of CIPE’s knowledge management effort have included toolkits, guides, and case studies. The initiative has also supported CIPE’s program development, strategy, and evaluation functions, and has promoted communities of practice.
Describe the approaches utilized to measure / assess this KM initiative
CIPE submits final evaluation reports to the National Endowment for Democracy on the results of each annual knowledge management project. CIPE reports against evaluation indicators presented in the project proposal, which are based on project objectives. These evaluations rely primarily on surveys of program staff, partners, and other users, as well as tracking of publication usage.

From the standpoint of this initiative, the application of knowledge is the most meaningful and practical measure of success, for several reasons. First, knowledge that is captured but not applied cannot be said to have impact. Second, the ultimate effects of knowledge sharing are hard to assess: it is difficult to determine precisely which pieces of information are absorbed, what they mean to the recipient, and to what extent they modify behavior. Third, CIPE recognizes that effective local solutions tend to be diverse and that shared knowledge is best adapted to its context; CIPE therefore does not look for wholesale replication of particular models. If knowledge is being applied, we know at least that users find value in it and are able to put it to use. We can infer, though we cannot prove, that applied knowledge influences project outcomes, especially when key lessons are incorporated into project designs.

Other relevant measures include the relevance, quality, and accessibility of knowledge. Surveys and statistics are best used in combination to assess different aspects of knowledge users’ behavior. For instance, closed survey questions and statistics measure usage patterns, while open-ended survey questions can reveal particularly significant applications. The former inform future programming; the latter indicate how the knowledge initiative matters.

In addition to evaluation indicators linked to project objectives, CIPE uses impact measures linked to CIPE’s organizational objectives. This methodology is described in the 25-Year Impact Evaluation, which noted that certain projects, such as knowledge management, support organizational functions that contribute to impact in the field. The Knowledge Management Officer assesses and reports on the program in accordance with the evaluation plan in the project proposal, and the Evaluation Officer reviews all project reports.
What was the purpose or motivation for assessing this KM initiative?
Evaluation reports are required by the donor, the National Endowment for Democracy. CIPE has performed project evaluations since its early years. Annual evaluations of knowledge management guide the ongoing improvement of the initiative, assess demand for resources, and capture the value of the initiative to CIPE and its mission.
What were the most important lessons learned about the assessment process?
The assessment should be appropriate to the project. We have found that a modest, repeated assessment process oriented toward project objectives can help refine the knowledge management initiative as it evolves.

CIPE’s initiative serves a variety of audiences and projects. While its components are designed with particular applications in mind, it has been important to remain open in gathering and using evaluation information. Since our partners and other users incorporate CIPE’s knowledge resources into programs of their own design, the key has been to capture actual applications rather than to test preconceived outcomes. Through assessment, we have learned how and where knowledge resources are being used, including specific program applications for advocacy and training by partners and other NGOs, as well as applications by leading development and policy organizations. This information has allowed CIPE to tailor its product formats and themes. Longer-term follow-up has also been important, because CIPE’s key publications have a long shelf life.

One constraint has been the demands on partners’ time. Partners already report frequently on their projects and respond to other information requests, so we must balance evaluation needs against the time available for project implementation and knowledge sharing.

Separately, evidence from CIPE’s 25-Year Impact Evaluation indicates that access-to-information programs are often associated with policy impact when they are integrated with other activities related to advocacy. In this light, knowledge management should be assessed with regard to its linkages with related programs and organizational processes.
What would you do differently next time?
Interviews with program staff or focus groups with partners could assess the ways in which people obtain their knowledge and where they look for lessons or models; a session on this topic with Middle Eastern partners was insightful. Where possible, it is desirable to collect continuous feedback on knowledge sharing activities and tools, since opportunities for regular interaction with staff and partners reveal information and applications that might not turn up in a survey. Coordination with other projects that support access to information could also allow a degree of benchmarking and exchange of lessons.
What advice would you give to others based on your experience?
Start with the fundamentals. A knowledge management initiative should reflect its organization’s mission, objectives, and strategy.
Keep it simple. Many users are drowning in information and need a clear starting place.
Invest in outreach. Knowledge must be promoted for audiences to become aware of it; develop relationships with key audiences and translate resources.
Integrate knowledge sharing with programs. Develop a culture and processes that encourage sharing throughout all programs.
Be willing to experiment. One sometimes discovers what works by trying something new.
Give users choices. They can decide which tools or examples are most relevant to their context.
What do you think are the main unanswered questions or challenges related to this field of work?
How can a knowledge management initiative evolve? For example, how might an initiative balance innovation and new objectives against maintaining a knowledge infrastructure and meeting daily information needs? How have successful initiatives developed past their early stages?