Assess, Adapt and Sustain

Step 12

The utility of the SMSS resides in its capacity to provide timely, good-quality data that can be used for accurate monitoring of national and subnational health impact. An assessment process should be incorporated into the design so that evidence on data quality is generated regularly. The assessment should be planned with the goals of reviewing 1) the data collection process, 2) the completeness and timeliness of the data collected, and 3) their comparability with other data. The assessment provides an opportunity to update the population listing to ensure the study has an accurate denominator for all calculations. It is also an opportunity to engage communities and stakeholders, which promotes buy-in and adoption of the system and facilitates the funding and resource mobilization needed to sustain it. In this section, we briefly discuss the steps for implementing an assessment, how to adapt the system, and how to continue promoting its sustainability.

Key Points

For the assessment, the most important question is whether the SMSS is generating data of good quality consistent with its set objectives and whether it is responding to the needs of the Ministry of Health and other stakeholders. Three complementary strategies can be implemented to answer this question appropriately:

  1. Internal data quality assessment, including timeliness and completeness.
  2. External data quality assessment, comparison with other existing data sources.
  3. Primary retrospective data collection compared with the prospective data reported by the system.

Internal Data Quality Assessment

This consists of a systematic review of trends in the monthly number of births and deaths reported. Trends should be reviewed both by date of event and by date of report. Review by date of event reveals seasonal patterns and helps identify inconsistencies and outliers. Review by date of report shows whether field data collectors are reporting into the system continuously and consistently over time. A longer-term (12 months or more) retrospective review of reporting by cluster also helps spot clusters where data have not been reported consistently. The internal data quality review should also examine reporting by administrative region or other subnational areas of interest, and the effects of any known natural disasters, conflicts or other crises. In addition to assessing data reporting, the consistency of the patterns of events reported and of the statistics generated should be examined. For example, examining the distribution of deaths by age can show whether specific age groups are under-reported, as commonly seen with stillbirths and neonatal deaths, and whether “age heaping” exists, meaning rounded ages are reported instead of exact ages. For verbal autopsy (VA) assessment, it is essential to compare the number of deaths for which VA interviews are conducted with the total deaths reported into the system. Finally, an internal assessment should examine the plausibility of the mortality statistics generated from the data.
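These checks can be scripted so they run each month as new data arrive. The sketch below is a minimal illustration in Python using pandas, assuming a hypothetical deaths table with columns cluster_id, event_date, report_date, age_years and va_completed; the column names and thresholds are assumptions, not part of the SMSS specification.

```python
# A minimal sketch of internal data quality checks, assuming a pandas DataFrame
# of reported deaths with hypothetical columns: cluster_id, event_date,
# report_date, age_years, va_completed (True/False).
import pandas as pd

def internal_quality_checks(deaths: pd.DataFrame) -> dict:
    deaths = deaths.copy()
    deaths["event_month"] = pd.to_datetime(deaths["event_date"]).dt.to_period("M")
    deaths["report_month"] = pd.to_datetime(deaths["report_date"]).dt.to_period("M")

    # 1. Monthly trends by date of event (seasonality, outliers) and by date of
    #    report (continuity of reporting by field data collectors).
    by_event = deaths.groupby("event_month").size()
    by_report = deaths.groupby("report_month").size()

    # 2. Clusters with months of zero reporting in the last 12 reporting months.
    #    (Clusters with no reports at all in the window will not appear here
    #    and should be checked against the full cluster list separately.)
    window_start = deaths["report_month"].max() - 11
    recent = deaths[deaths["report_month"] >= window_start]
    cluster_months = (recent.groupby(["cluster_id", "report_month"])
                            .size().unstack(fill_value=0))
    gap_clusters = cluster_months[(cluster_months == 0).any(axis=1)].index.tolist()

    # 3. Age heaping: share of deaths at ages 10+ reported at multiples of five.
    adult_ages = deaths.loc[deaths["age_years"] >= 10, "age_years"].astype(int)
    heaping_share = (adult_ages % 5 == 0).mean() if len(adult_ages) else float("nan")

    # 4. Verbal autopsy coverage: share of reported deaths with a completed VA.
    va_coverage = deaths["va_completed"].mean()

    return {
        "deaths_by_event_month": by_event,
        "deaths_by_report_month": by_report,
        "clusters_with_reporting_gaps": gap_clusters,
        "age_heaping_share": heaping_share,
        "va_coverage": va_coverage,
    }
```

Under this illustrative check, a heaping share well above 20 percent (the share expected if terminal digits of age were evenly distributed) suggests that rounded rather than exact ages are being reported, and VA coverage noticeably below one points to deaths without a completed VA interview.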

External Data Assessment

This assessment consists of comparing the total number of events and the statistics generated from the data with external data or known statistics, such as those from the CRVS. A straightforward assessment is to compare the annual numbers of births and deaths reported in the system with the numbers expected, based on the total population of the SMSS sample and known estimates of crude birth and death rates, such as those from a population census or a national survey such as the Demographic and Health Surveys (DHS). Such a comparison is useful for understanding the likely completeness of event reporting. Another layer of assessment consists of comparing all-cause and cause-specific mortality rates by age, or childhood mortality rates, with those from recent surveys or other existing systems known to produce accurate mortality estimates.
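The expected-versus-reported comparison reduces to simple arithmetic: expected annual events equal the SMSS population multiplied by the crude rate per 1,000, and completeness is the ratio of reported to expected events. The sketch below illustrates this with made-up numbers; the population, crude rates and reported counts are placeholders, not COMSA or DHS figures.

```python
# A minimal sketch of the external completeness check described above. The
# population, crude rates and reported counts are illustrative placeholders;
# in practice they would come from a census, the DHS or other national
# estimates and from the SMSS database.

def expected_events(population: float, crude_rate_per_1000: float) -> float:
    """Expected annual events = population x crude rate per 1,000 / 1,000."""
    return population * crude_rate_per_1000 / 1000.0

def completeness(reported: float, expected: float) -> float:
    """Reported events as a share of expected events."""
    return reported / expected

# Assumed example: an SMSS sample of 700,000 people, a crude birth rate of
# 38 per 1,000 and a crude death rate of 9 per 1,000.
exp_births = expected_events(700_000, 38)   # 26,600 births expected per year
exp_deaths = expected_events(700_000, 9)    # 6,300 deaths expected per year

print(f"Birth reporting completeness: {completeness(24_000, exp_births):.0%}")  # ~90%
print(f"Death reporting completeness: {completeness(4_800, exp_deaths):.0%}")   # ~76%
```

In this made-up example, birth reporting would be roughly 90 percent complete and death reporting roughly 76 percent complete, a pattern that would prompt closer follow-up of death reporting.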

Primary Retrospective Data Collection

Keeping track of the total population of the SMSS can be challenging given population movement. We recommend conducting a population and household census every two to three years to obtain accurate estimates of the total population. This census can also provide an opportunity to collect retrospective data on events in the past one or two years. These data should be collected by an external, well-trained team that visits households within each cluster. To compare the assessment data with the SMSS data at the level of individual events, care must be taken to ensure that the identifying information collected allows households, household members and events to be linked. This can be challenging if the SMSS has not established a strong household identification system, which is often difficult to do. In the case of SIS-COVE in Mozambique, data collectors carried a printout of all households in each cluster so that matching could be done in the field, either on an existing household identifier or on the name of the head of the household. Community surveillance agents helped identify households during data collection, and for matched households the data collector confirmed household members against a database on their tablet. The IT system established in Mozambique tracks households through GIS coordinates and helps monitor the completeness of the data collection. This comprehensive data collection yields an updated population by age and sex for the system and also provides an opportunity to confirm cluster boundaries. Furthermore, the population data can be analyzed to assess the continued representativeness of the sample by comparing them with an external representative data source, such as a population survey or a national household survey such as the DHS.
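The matching logic can be summarized as a two-stage rule: match on an existing household identifier first, and fall back to the cluster plus the name of the head of household when no identifier is available. The sketch below is a minimal illustration of that rule only, not a description of the SIS-COVE software; the field names and the name normalization are assumptions.

```python
# A minimal sketch of the two-stage household matching described above: match
# on an existing household identifier first, then fall back to cluster plus
# the name of the head of household. Field names and the name normalization
# are illustrative assumptions, not the SIS-COVE implementation.
from typing import Optional
import unicodedata

def normalize_name(name: str) -> str:
    """Lowercase, strip accents and collapse whitespace for a tolerant comparison."""
    decomposed = unicodedata.normalize("NFKD", name)
    stripped = "".join(c for c in decomposed if not unicodedata.combining(c))
    return " ".join(stripped.lower().split())

def match_household(census_record: dict, smss_households: list) -> Optional[dict]:
    """Return the SMSS household matching a retrospective census record, if any."""
    # Stage 1: exact match on the household identifier, when one was recorded.
    hh_id = census_record.get("household_id")
    if hh_id:
        for hh in smss_households:
            if hh["household_id"] == hh_id:
                return hh

    # Stage 2: fall back to cluster plus normalized head-of-household name.
    target = (census_record["cluster_id"], normalize_name(census_record["head_name"]))
    for hh in smss_households:
        if (hh["cluster_id"], normalize_name(hh["head_name"])) == target:
            return hh

    return None  # unmatched records are flagged for field follow-up
```

Matching on a stable household identifier first keeps the more error-prone name-based comparison to a minimum; households that remain unmatched can be flagged for verification while the team is still in the field.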

Adapt

It is important that the system grow and adapt to stay relevant to stakeholder priorities. To adapt the system effectively, you first need to understand its weaknesses and where it is failing to meet its goals. Is the system addressing the needs of the Ministry of Health, the CRVS and other stakeholders? Are the design and sample strong enough to produce nationally and subnationally representative estimates of mortality and causes of death? To answer these questions, you’ll need to collect feedback from the MoH and stakeholders and consider adapting the system to collect additional data in response to their needs. This feedback can be collected during the presentation of results at stakeholder meetings and other scientific or policy-oriented forums. The ability to respond to key stakeholders’ needs will also help promote the sustainability of the system by creating stronger buy-in and by leveraging financial support from donors.

Another, more technical aspect of adapting the SMSS is ensuring that the design and the sample remain strong and can generate representative mortality estimates with acceptable precision. Given the decline in mortality over time and epidemiological changes that shift cause-of-death patterns, the SMSS sample may become too small for estimating mortality within some predefined domains. By collecting and analyzing the assessment data described above, sample distortions can be examined at the national and subnational levels. Based on current levels of all-cause and cause-specific mortality, updated sample size calculations can be carried out to assess whether the current SMSS sample remains adequate. It may also be that, over time, specific programs within the MoH or other stakeholders will want estimates for domains not included in the initial sample design. Responding to such demands will require adapting the sample.
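As a rough illustration of the kind of recalculation referred to above, the sketch below uses a common Poisson-based approximation for the person-years of observation needed to estimate a mortality rate with a given relative precision, inflated by a design effect for cluster sampling. This is not a prescribed SMSS formula, and the rates, precision target and design effect are assumptions for illustration.

```python
# A rough illustration of an updated sample size check: person-years of
# observation needed to estimate a mortality rate with a given relative
# precision, using a Poisson approximation inflated by a design effect for
# cluster sampling. Inputs are illustrative assumptions, not SMSS parameters.
from statistics import NormalDist

def person_years_needed(rate_per_1000: float,
                        relative_precision: float,
                        design_effect: float = 2.0,
                        confidence: float = 0.95) -> float:
    """Person-years required so the confidence-interval half-width stays
    within `relative_precision` of the true rate."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    rate = rate_per_1000 / 1000.0
    return design_effect * z ** 2 / (relative_precision ** 2 * rate)

# Assumed example: if under-five mortality declines from 12 to 8 deaths per
# 1,000 person-years, the observation needed for +/-15% relative precision
# grows from roughly 28,500 to roughly 42,700 person-years.
print(round(person_years_needed(12, 0.15)))
print(round(person_years_needed(8, 0.15)))
```

Under these assumed inputs, a decline in the mortality rate increases the person-years needed for the same relative precision, which is why updated calculations may show that the current sample has become too small for some domains.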

Sustain

Discussions of the sustainability of the SMSS must start from the initial conception and definition of the system and should continue throughout implementation, data integration into the CRVS, data sharing and dissemination. While securing long-term financing is key to sustainability, the system must also continue to demonstrate its utility for the country. Financial sustainability can be achieved by leveraging development partners and multilateral or bilateral donors, as well as through gradual domestic financing. The governance of the system and its institutional leadership are critical for developing successful funding proposals. Demonstrating the utility of the SMSS requires sharing the data produced, linking or integrating the system with other existing systems, particularly the CRVS, and demonstrating the quality of the estimates generated.

To Learn More

Mozambique Sample Vital Statistics System: Filling the Gaps for Mortality Data – A commentary from the 2023 COMSA supplement in The American Journal of Tropical Medicine and Hygiene

Implementing the Countrywide Mortality Surveillance in Action in Mozambique: How Much Did It Cost? – An original research article from the 2023 COMSA supplement in The American Journal of Tropical Medicine and Hygiene

From External to Local: Opportunities and Lessons Learned from Transitioning COMSA-Mozambique – An original research article from the 2023 COMSA supplement in The American Journal of Tropical Medicine and Hygiene

Completeness and factors affecting the community workers’ reporting of births and deaths in the countrywide mortality surveillance for action in Mozambique – A peer-reviewed paper describing the completeness of reporting community vital events


Sustainability Lessons from the Mozambique COMSA/SIS-COVE Strategy

While an SMSS can, from the outset, be designed, financed and implemented by the country government, the initial years of setting up the system are often externally funded and supported with external technical assistance. Under such scenarios it is essential to develop a transition plan that ensures full handover of the system to country leadership and ownership. Below we discuss lessons learned during the transition process for COMSA/SIS-COVE.

Design the system with an eventual transition in mind: Having the intention to transition from project inception informs the choices made as the system is developed. For example, embedding the system in local institutions means it will be better aligned with local structures and have a more secure footing toward sustainability. Likewise, leveraging relationships with other ministries and agencies can support institutionalization upon transition.

Approach capacity building strategically for both the program’s immediate needs and post-transition management: This will likely require identifying and addressing local capacity needs over the life of the project.

Be careful about programmatic shortcuts that must be unwound when transition time comes: Working within local structures, especially within government, can be more complicated and slower, so shortcuts that kick off implementation promptly can be appealing; however, those shortcuts often have to be retrofitted or redesigned later.

Partners need to agree early about the specifics of transition: In particular, which aspects of the program are prioritized for transition and where adaptations will be necessary or preferred to ensure the program’s longevity. These decisions drive later considerations around funding, stakeholders, institutionalization, etc. Transition planning requires effort, strategy, and broad agreement on the goals:

  • Specific attention, investment, and time should be allocated to transition processes to minimize negative consequences on program objectives.

  • If transition is the goal from the outset, considerations about future funding should always be on the agenda so that both local resources as well as external ones can be fostered.

  • Stakeholder management is critical to ensure current stakeholders remain committed to the program, and new or potential stakeholders can be brought on board to support the program.

SIS-COVE’s efforts to distribute, build and restructure analytical and information technology responsibilities among partners, from project inception through transition, are an example that draws on all of these principles.
