6. Measuring Success and Impact
It is easy to get caught up in carrying out observation activities and not take the time to step back and look at the bigger picture. This is why it is important to plan regular 'reality checks' for program staff to sit down and reflect on successes and shortcomings, make informed decisions, and assess the program's progress toward its goals. Reflecting on activities and progress allows groups to assess how well their strategies are working and provides valuable information for adjusting or improving their data analysis and/or advocacy plan.
How effective was a group's strategy in achieving its observation goals?
For election observer groups, progress will be measured against the group's observation goals and objectives, of which open election data advocacy might be one part. Under Step 1, groups will have defined observation priorities, objectives, and a strategy tailored to their political context. Groups might have different priorities around election data interventions, such as:
- Improving access to election data;
- Improving data literacy among government institutions; or
- Conducting data analysis in order to assess the quality of electoral processes.
Indicators or metrics for measuring progress toward goals and objectives will also look different between groups. Generally, SMART (Specific, Measurable, Attainable, Relevant, and Time-bound) goals are better than vague ones, as defining concrete and specific goals keeps a group focused and prevents it from feeling overwhelmed. A group's interventions, including any work around open election data, should contribute to achieving clear goals.
How effective was the advocacy strategy?
Groups that aim to improve access to election data or improve data literacy among government institutions would design a strategy for advocating for these changes with relevant stakeholders. Under Step 4, groups will have considered approaches to advocating directly with stakeholders to gain access to specific data or data formats. Part of a group's advocacy around election data may include encouraging public institutions to release election data according to open election data principles or to improve the way data is communicated to the public. In this case, groups would want to measure the success of their advocacy in increasing access to election data.
Consider any lessons learned: What advocacy strategies worked well? What didn't? How will you change the strategies that fell short? How else can you adapt? Are you on track to meet your goals?
Potential indicators for measuring the effectiveness of an advocacy strategy could include:
- The number of meetings or consultations with government officials and key stakeholders on the importance of open election data;
- The number of resources produced on the importance of open election data;
- The reach of social media posts or campaigns on the importance of open election data; and
- Any changes in the availability or format of official government data.
Any indicators used to measure progress should be specific and tailored to the activities, and should ultimately build toward the goals and objectives defined under Step 1.
How impactful were the data analysis and findings?
Once groups acquire data, they may analyze it and use the findings as part of a wider election observation effort to promote electoral integrity. Groups should measure the success of open data analysis against the overall goals and objectives set out under Step 1. Reflecting on the success and impact of data analysis will also help groups identify what else can be done with this or other data to bolster traditional election observation methods.
Remember that open data analysis won’t always reveal glaring discrepancies or errors in data that should be fixed; finding nothing or very little wrong after analyzing data is a finding on its own.
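To make the idea of open data analysis concrete, the minimal sketch below shows one basic consistency check a group might run on published polling-station results. The file name and column layout (polling_station_id, registered_voters, valid_votes, invalid_votes, and one votes_* column per candidate) are assumptions for illustration only, not a prescribed format; real checks would be tailored to the data a group actually obtains.

```python
import csv

# Illustrative sketch only: assumes a hypothetical results file with one row per
# polling station and the column names described in the lead-in above.
def check_results(path):
    issues = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            station = row.get("polling_station_id", "<unknown>")
            # Sum of per-candidate columns should match the reported valid votes.
            candidate_total = sum(
                int(v) for k, v in row.items() if k.startswith("votes_")
            )
            valid = int(row["valid_votes"])
            cast = valid + int(row["invalid_votes"])
            registered = int(row["registered_voters"])
            if candidate_total != valid:
                issues.append(f"{station}: candidate votes ({candidate_total}) != valid votes ({valid})")
            if cast > registered:
                issues.append(f"{station}: ballots cast ({cast}) exceed registered voters ({registered})")
    return issues

if __name__ == "__main__":
    problems = check_results("results.csv")
    # An empty list is itself a finding: the published data passed these basic checks.
    print("\n".join(problems) if problems else "No inconsistencies found in these checks.")
```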
Consider how well findings from data analysis were used to build accountability or augment other observation analysis. Which ways of presenting or otherwise using findings from open data analysis worked well, and which did not? Were findings used by other stakeholders, such as the media, government officials, or other civil society organizations?
Potential indicators for measuring the impact of open data findings could include:
- The number of reports or statements issued;
- The number of media hits or mentions;
- The number of recommendations made by your group or other CSOs; and
- Any commitments by government officials to address recommendations made by your group.
Keep track of progress over time
Be aware that what a group has now is no guarantee of what it will have in the future. Sustained advocacy is key to creating change around open data. As political leaders, public opinion, and laws and regulations change and evolve, groups will need to adapt to new challenges and opportunities. As part of this, groups should keep track of progress and impact over time. Measuring success should not happen only once: groups should regularly assess the success of their advocacy and data analysis efforts so that they measure impact over time, identify and address any unintended outcomes, and adjust strategies to achieve better results. This information can provide useful insights into strategies for future advocacy or activities.
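As one simple illustration of tracking progress over time, the sketch below assumes a group keeps a log of indicator values by reporting period in a hypothetical indicators.csv file with period, indicator, and value columns, and summarizes how each indicator has moved. Any real monitoring system would use the group's own indicators and tools; this is only a sketch of the idea.

```python
import csv
from collections import defaultdict

# Illustrative sketch only: assumes a hypothetical log file "indicators.csv" with
# columns "period", "indicator", and "value" (e.g., 2024-Q1, meetings_held, 4).
def summarize_progress(path):
    history = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            history[row["indicator"]].append((row["period"], float(row["value"])))

    for indicator, records in history.items():
        records.sort()  # periods sort chronologically if named consistently
        first, latest = records[0][1], records[-1][1]
        trend = "up" if latest > first else "down" if latest < first else "flat"
        print(f"{indicator}: {first:g} -> {latest:g} ({trend} over {len(records)} periods)")

if __name__ == "__main__":
    summarize_progress("indicators.csv")
```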
Reflect on operations and staffing capacities
An important part of evaluating progress is assessing how the group's internal capacity affected its ability to implement its open election data plan. Internal capacities, including staffing and operational constraints, may affect a group's ability to carry out activities and achieve goals.
In Step 3, groups assessed their organizational capacity to use open election data as part of defining an open election data plan. Consider how your internal staffing and operational capacities affected your ability to conduct activities effectively. Has internal expertise improved, or have its limits become clearer? Did you have the capacity to achieve your goals? If not, what could change? What skills, staffing, or other operational considerations are needed to continue progress toward goals? Refer back to Exercise B to consider what may have changed since that assessment.
Review Exercise B: Organizational Capacity to Use Open Election Data
Use Exercise E as a guide for structuring a group's reflection on its progress toward established goals.