What is a Monitoring and Evaluation Plan?
A monitoring and evaluation (M&E) plan is a document that helps to track and assess the results of the interventions throughout the life of a program. It is a living document that should be referred to and updated on a regular basis. While the specifics of each program’s M&E plan will look different, they should all follow the same basic structure and include the same key elements.
An M&E plan will include some documents that may have been created during the program planning process, and some that will need to be created new. For example, elements such as the logic model/logical framework, theory of change, and monitoring indicators may have already been developed with input from key stakeholders and/or the program donor. The M&E plan takes those documents and develops a further plan for their implementation.
Why develop a Monitoring and Evaluation Plan?
It is important to develop an M&E plan before beginning any monitoring activities so that there is a clear plan for what questions about the program need to be answered. It will help program staff decide how they are going to collect data to track indicators, how monitoring data will be analyzed, and how the results of data collection will be disseminated both to the donor and internally among staff members for program improvement. Remember, M&E data alone is not useful until someone puts it to use! An M&E plan will help make sure data is being used efficiently to make programs as effective as possible and to be able to report on results at the end of the program.
Who should develop a Monitoring and Evaluation Plan?
An M&E plan should be developed by the research team or staff with research experience, with inputs from program staff involved in designing and implementing the program.
When should a Monitoring and Evaluation Plan be developed?
An M&E plan should be developed at the beginning of the program when the interventions are being designed. This will ensure there is a system in place to monitor the program and evaluate success.
Who is this guide for?
This guide is designed primarily for program managers or personnel who are not trained researchers themselves but who need to understand the rationale and process of conducting research. It can help managers make the case for research and ensure that research staff have adequate resources to do the work needed to keep the program evidence-based, to track results over time, and to measure them at the end of the program.
After completing the steps for developing an M&E plan, the team will:
- Identify the elements and steps of an M&E plan
- Explain how to create an M&E plan for an upcoming program
- Describe how to advocate for the creation and use of M&E plans for a program/organization
Estimated Time Needed
Developing an M&E plan can take up to a week, depending on the size of the team available to develop the plan, and whether a logic model and theory of change have already been designed.
Step 1: Identify Program Goals and Objectives
The first step to creating an M&E plan is to identify the program goals and objectives. If the program already has a logic model or theory of change, the goals are most likely already defined; if not, the M&E planning process is a good place to define them.
Defining program goals starts with answering three questions:
- What problem is the program trying to solve?
- What steps are being taken to solve that problem?
- How will program staff know when the program has been successful in solving the problem?
Answering these questions will help identify what the program is expected to do, and how staff will know whether or not it worked. For example, if the program is starting a condom distribution program for adolescents, the answers might look like this:
| Problem | High rates of unintended pregnancy and sexually transmitted infection (STI) transmission among youth ages 15-19 |
| Solution | Promote and distribute free condoms in the community at youth-friendly locations |
| Success | Lower rates of unintended pregnancy and STI transmission among youth ages 15-19; a higher percentage of condom use among sexually active youth |
From these answers, it can be seen that the overall program goal is to reduce the rates of unintended pregnancy and STI transmission in the community.
It is also necessary to develop intermediate outputs and objectives for the program to help track successful steps on the way to the overall program goal. More information about identifying these objectives can be found in the logic model guide.
Step 2: Define Indicators
Once the program’s goals and objectives are defined, it is time to define indicators for tracking progress towards achieving those goals. Program indicators should be a mix of those that measure process, or what is being done in the program, and those that measure outcomes.
Process indicators track the progress of the program. They help to answer the question, “Are activities being implemented as planned?” Some examples of process indicators are:
- Number of trainings held with health providers
- Number of outreach activities conducted at youth-friendly locations
- Number of condoms distributed at youth-friendly locations
- Percent of youth reached with condom use messages through the media
Outcome indicators track how successful program activities have been at achieving program objectives. They help to answer the question, “Have program activities made a difference?” Some examples of outcome indicators are:
- Percent of youth using condoms during first intercourse
- Number and percent of trained health providers offering family planning services to youth
- Number and percent of new STI infections among youth.
These are just a few examples of indicators that can be created to track a program’s success. More information about creating indicators can be found in the How to Develop Indicators guide.
Step 3: Define Data Collection Methods and Timeline
After creating monitoring indicators, it is time to decide how data will be gathered and how often it will be recorded to track each indicator. This should be a conversation among program staff, stakeholders, and donors. These decisions have important implications for the data collection methods used and for how results will be reported.
The source of monitoring data depends largely on what each indicator is trying to measure. The program will likely need multiple data sources to answer all of the programming questions. Below is a table that represents some examples of what data can be collected and how.
| Information to be collected | Data source(s) |
| --- | --- |
| Implementation process and progress | Program-specific M&E tools |
| Service statistics | Facility logs, referral cards |
| Reach and success of the program intervention within audience subgroups or communities | Small surveys with primary audience(s), such as provider interviews or client exit interviews |
| Reach of media interventions involved in the program | Media ratings data, broadcaster logs, Google Analytics, omnibus surveys |
| Reach and success of the program intervention at the population level | Nationally representative surveys, omnibus surveys, DHS data |
| Qualitative data about the outcomes of the intervention | Focus groups, in-depth interviews, listener/viewer group discussions, individual media diaries, case studies |
Once it is determined how data will be collected, it is also necessary to decide how often it will be collected. This will be affected by donor requirements, available resources, and the timeline of the intervention. Some data will be continuously gathered by the program (such as the number of trainings), but these will be recorded every six months or once a year, depending on the M&E plan. Other types of data depend on outside sources, such as clinic and DHS data.
After all of these questions have been answered, a table like the one below can be made to include in the M&E plan. This table can be printed out and all staff working on the program can refer to it so that everyone knows what data is needed and when.
| Indicator | Data source(s) | Timing |
| --- | --- | --- |
| Number of trainings held with health providers | Training attendance sheets | Every 6 months |
| Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months |
| Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months |
| Percent of youth receiving condom use messages through the media | Population-based surveys | Annually |
| Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually |
| Number and percent of trained health providers offering family planning services to adolescents | Facility logs | Every 6 months |
| Number and percent of new STI infections among adolescents | DHS or other population-based survey | Annually |
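For teams that track this schedule electronically, the collection timeline can also be kept as structured data so that simple scripts can flag what is due in any reporting period. The sketch below is purely illustrative: the indicator names, sources, and frequencies are examples from the table above, not a prescribed format.

```python
# Illustrative sketch: the indicator schedule kept as structured data so a
# reporting script can filter what is due. All names and frequencies here
# are examples drawn from the sample table, not a required format.
INDICATORS = [
    {"name": "Trainings held with health providers",
     "source": "Training attendance sheets", "frequency_months": 6},
    {"name": "Outreach activities at youth-friendly locations",
     "source": "Activity sheet", "frequency_months": 6},
    {"name": "Youth receiving condom messages through media (%)",
     "source": "Population-based surveys", "frequency_months": 12},
]

def due_indicators(indicators, month):
    """Return the names of indicators due for compilation in a given program month."""
    return [i["name"] for i in indicators if month % i["frequency_months"] == 0]

print(due_indicators(INDICATORS, 6))   # the two six-monthly indicators
print(due_indicators(INDICATORS, 12))  # all three indicators
```

A spreadsheet serves the same purpose for most teams; the point is simply that frequencies recorded once, in one place, keep reporting cycles consistent.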
Step 4: Identify M&E Roles and Responsibilities
The next element of the M&E plan is a section on roles and responsibilities. It is important to decide from the early planning stages who is responsible for collecting the data for each indicator. This will probably be a mix of M&E staff, research staff, and program staff. Everyone will need to work together to get data collected accurately and in a timely fashion.
Data management roles should be decided with input from all team members so everyone is on the same page and knows which indicators they are assigned. This way when it is time for reporting there are no surprises.
An easy way to put this into the M&E plan is to expand the indicators table with additional columns for who is responsible for each indicator, as shown below.
| Indicator | Data source(s) | Timing | Data manager |
| --- | --- | --- | --- |
| Number of trainings held with health providers | Training attendance sheets | Every 6 months | Activity manager |
| Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months | Activity manager |
| Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months | Activity manager |
| Percent of youth receiving condom use messages through the media | Population-based survey | Annually | Research assistant |
| Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually | Research assistant |
| Number and percent of trained health providers offering family planning services to adolescents | Facility logs | Every 6 months | Field M&E officer |
| Number and percent of new STI infections among adolescents | DHS or other population-based survey | Annually | Research assistant |
Step 5: Create an Analysis Plan and Reporting Templates
Once all of the data have been collected, someone will need to compile and analyze it to fill in a results table for internal review and external reporting. This is likely to be an in-house M&E manager or research assistant for the program.
The M&E plan should include a section with details about what data will be analyzed and how the results will be presented. Do research staff need to perform any statistical tests to get the needed answers? If so, what tests are they and what data will be used in them? What software program will be used to analyze data and make reporting tables? Excel? SPSS? These are important considerations.
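As one concrete example of what an analysis plan might specify: if the plan calls for testing whether an outcome proportion (such as reported condom use) changed significantly between baseline and a follow-up survey, a two-proportion z-test is a common choice. The sketch below uses hypothetical survey numbers and only the Python standard library; the actual test, software, and data would be whatever the plan specifies.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test: returns the z statistic and approximate p-value."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation p-value via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: condom use at first intercourse, baseline vs. year 1 surveys
z, p = two_proportion_z(success_a=120, n_a=600, success_b=180, n_b=600)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Whether this particular test is appropriate depends on the survey design (simple random sampling is assumed here); complex survey data usually needs design-adjusted analysis in a statistical package.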
Another good thing to include in the plan is a blank table for indicator reporting. These tables should outline the indicators, data, and time period of reporting. They can also include things like the indicator target, and how far the program has progressed towards that target. An example of a reporting table is below.
| Indicator | Baseline | Year 1 | Lifetime target | % of target achieved |
| --- | --- | --- | --- | --- |
| Number of trainings held with health providers | 0 | 5 | 10 | 50% |
| Number of outreach activities conducted at youth-friendly locations | 0 | 2 | 6 | 33% |
| Number of condoms distributed at youth-friendly locations | 0 | 25,000 | 50,000 | 50% |
| Percent of youth receiving condom use messages through the media | 5% | 35% | 75% | 47% |
| Percent of adolescents reporting condom use during first intercourse | 20% | 30% | 80% | 38% |
| Number and percent of trained health providers offering family planning services to adolescents | 20 | 106 | 250 | 80% |
| Number and percent of new STI infections among adolescents | 11,000 (22%) | 10,000 (20%) | 10% reduction in 5 years | 20% |
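The "% of target achieved" column is simple arithmetic, but the plan should state the convention used. The sketch below computes it as cumulative value divided by lifetime target, which matches most rows in the example table; programs with non-zero baselines sometimes prefer (current − baseline) / (target − baseline) instead, and the plan should say which applies.

```python
def percent_of_target(current, target):
    """Share of the lifetime target reached so far, as a whole-number percent.

    Assumes the convention cumulative value / lifetime target; progress can
    also be measured from a non-zero baseline, depending on the M&E plan.
    """
    return round(100 * current / target)

# Values taken from the example reporting table
print(percent_of_target(5, 10))           # 50  (trainings held)
print(percent_of_target(25_000, 50_000))  # 50  (condoms distributed)
print(percent_of_target(35, 75))          # 47  (media message reach, %)
```

Putting the formula in the plan (or in the spreadsheet template itself) keeps the column consistent across reporting periods and across the staff who fill it in.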
Step 6: Plan for Dissemination and Donor Reporting
The last element of the M&E plan describes how and to whom data will be disseminated. Data for data’s sake should not be the ultimate goal of M&E efforts. Data should always be collected for particular purposes.
Consider the following:
- How will M&E data be used to inform staff and stakeholders about the success and progress of the program?
- How will it be used to help staff make modifications and course corrections, as necessary?
- How will the data be used to move the field forward and make program practices more effective?
The M&E plan should include plans for internal dissemination among the program team, as well as wider dissemination among stakeholders and donors. For example, a program team may want to review data on a monthly basis to make programmatic decisions and develop future workplans, while meetings with the donor to review data and program progress might occur quarterly or annually. Dissemination of printed or digital materials might occur at more frequent intervals. These options should be discussed with stakeholders and your team to determine reasonable expectations for data review and to develop plans for dissemination early in the program. If these plans are in place from the beginning and become routine for the project, meetings and other kinds of periodic review have a much better chance of being productive ones that everyone looks forward to.
After following these 6 steps, the outline of the M&E plan should look something like this:
- Introduction to program
- Program goals and objectives
- Logic model/Logical Framework/Theory of change
- Table with data sources, collection timing, and staff member responsible
- Roles and Responsibilities
  - Description of each staff member's role in M&E data collection, analysis, and/or reporting
- Analysis plan
  - Reporting template table
- Dissemination plan
  - Description of how and when M&E data will be disseminated internally and externally
Tips & Recommendations
- Avoid over-promising what data can be collected: it is better to collect fewer data well than a lot of data poorly. Program staff should take a hard look at the staff time and resource costs of data collection to determine what is realistic.
Glossary & Concepts
- Process indicators track how the implementation of the program is progressing. They help to answer the question, “Are activities being implemented as planned?”
- Outcome indicators track how successful program activities have been at achieving program goals. They help to answer the question, “Have program activities made a difference?”
Resources and References
Evaluation Toolbox. Step by Step Guide to Create your M&E Plan. Retrieved from: http://evaluationtoolbox.net.au/index.php?option=com_content&view=article&id=23:create-m-and-e-plan&catid=8:planning-your-evaluation&Itemid=44
infoDev. Developing a Monitoring and Evaluation Plan for ICT for Education. Retrieved from: https://www.infodev.org/infodev-files/resource/InfodevDocuments_287.pdf
FHI360. Developing a Monitoring and Evaluation Work Plan. Retrieved from: http://www.fhi360.org/sites/default/files/media/documents/Monitoring%20HIV-AIDS%20Programs%20(Facilitator)%20-%20Module%203.pdf
Banner Photo: © 2012 Akintunde Akinleye/NURHI, Courtesy of Photoshare