Lessons for Health Program Monitoring and Evaluation in a Low Resource Setting
Numerous guidelines outline best practices for health program monitoring and evaluation (M&E). However, health programs are often implemented in less-than-ideal circumstances where following these best practices may not be feasible or adequately resourced.
This article describes how M&E was conducted for a health service delivery improvement program in remote Papua New Guinea and outlines six lessons learned: integrate M&E into every aspect of the program; strengthen existing health information data; link primary data collection to existing program activities; monitor and provide feedback regularly to identify implementation issues early; involve the program team in evaluation; and communicate M&E data to stakeholders through multiple media.
These lessons could be applied to other health programs implemented in low resource settings.
January 9, 2021