27 Apr 2016

The Vaccination Landscape: Changes and Challenges

What was the last vaccination you received? The one before that? When did you receive them?  Where was the vaccine administered – your arm? Your thigh? The right or the left? For most of us, this is not easy information to remember. And yet it’s what we ask many women to recall for their children in DHS interviews.

A child in Lhoksemawe, Aceh, Indonesia, receives a vaccine injection.

© 2012 Armin Hari/INSIST, Courtesy of Photoshare

Since 1984, The DHS Program has collected data on vaccinations in over 80 countries.  During this time, the vaccination landscape has changed dramatically. Initially, BCG, DPT (Diphtheria; Pertussis, or whooping cough; and Tetanus), Polio, and Measles were the only childhood vaccines most countries administered. These data were collected only from vaccination cards.

As time went on, our methodology expanded to include mothers' recall to clarify incomplete vaccination cards. When there is no vaccination card at all, the mother is asked whether her child received each type of vaccine and, if so, how many doses.
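In published coverage indicators, a child can count as vaccinated based on either the card or the mother's report. As a minimal sketch, assuming a simple rule of taking the larger dose count from the two sources (the function, data, and combination rule here are illustrative, not actual DHS variables or tabulation logic):

```python
def vaccinated(card_doses, recall_doses, required_doses):
    """A child counts as vaccinated if either source (card or
    mother's recall) shows the required number of doses.
    Hypothetical rule for illustration only."""
    doses = max(card_doses or 0, recall_doses or 0)
    return doses >= required_doses

# Made-up example records: (doses on card, doses from mother's recall)
children = [
    (3, None),   # complete card
    (1, 3),      # incomplete card; recall fills the gap
    (None, 3),   # no card; recall only
    (None, 0),   # no card; no recalled doses
]

covered = sum(vaccinated(c, r, required_doses=3) for c, r in children)
coverage = covered / len(children)
print(f"Pentavalent (3-dose) coverage: {coverage:.0%}")  # prints 75%
```

The sketch shows why recall matters: two of the three "covered" children in this toy sample would be missed if only complete cards were accepted.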

Today, the number of vaccines standardly administered is much higher. This is good news — it means millions more lives saved — but it introduces data collection challenges.

The vaccination data collected in a standard DHS questionnaire are far more elaborate than at any time in our history: BCG remains; for Polio, a birth dose has been added; DPT is now combined into a pentavalent vaccine with Hepatitis B and Haemophilus influenzae type b (Hib); 3 doses of a pneumococcal vaccine and 3 doses of a rotavirus vaccine are now included; and the measles vaccine has been replaced with a measles combination vaccine. For countries that are transitioning between vaccination schedules or have more complicated schedules, the landscape becomes more challenging to navigate.

Data collection is relatively straightforward when a child has an up-to-date and complete vaccination card.  But in many cases, changing vaccine schedules and inconsistent record-keeping render the cards incomplete or unclear. Worse, vaccination cards are often missing or otherwise unavailable.

Indeed, a recent study of the 10 countries with the largest birth cohorts that had carried out a DHS or MICS survey in 2010-2013 found that in 4 of them, fewer than 50% of children had home-based vaccination records.

A happy mother shows her child's vaccination card in New Delhi, India.

© 2012 Bhupendra/MCHIP, Courtesy of Photoshare

When a vaccination card is unavailable, it is the mother who is expected to fill in the gaps.  But a mother’s recall is not 100% reliable. Keeping track of vaccines for multiple children and for combination vaccines is challenging enough, but even more so if the mother isn’t present for every vaccination event.

Knowing whether newer and existing vaccines are reaching their target populations on schedule is valuable information for health programs and policymakers. What can be done to maintain and even improve data quality as the complexity of data needs on vaccine coverage continues to grow?

The DHS Program, in collaboration with the World Health Organization (WHO), the United Nations Children's Fund (UNICEF), and other experts in the field, is pursuing several options. One involves visiting health facilities to compare the information collected during the interview with the vaccination information recorded at the health facility.

This process has been used previously in Central Asian countries and in Albania, where facility-based documentation is strong. It is now being pilot-tested in Ethiopia; however, the method of record-keeping varies by location and includes instances where records are kept based on the date of visit and not based on the child’s name or date of birth. Country-specific challenges such as these require additional flexibility and coordination between survey implementers, Ministries of Health, and health facilities.

There is unlikely to be a one-size-fits-all solution to the challenge of accurately measuring vaccination coverage. But from the perspective of global health, this is a good problem to have. More children worldwide are being protected from a host of illnesses. We are proud to share data that help track progress toward closing the gap as our implementing partners reach more children with a larger variety of vaccines.

20 Apr 2016

From the Field: 2014-15 Uganda Malaria Indicator Survey (UMIS) Team

From left to right: Patrick, Aziza, Irene N., Doreen, Persis, Irene B. with Uganda Bureau of Statistics (UBOS) survey vehicle

During fieldwork for a household survey, survey teams visit households selected to represent an entire country. Respondents are as diverse as the country itself, living in mountains and valleys, deep in forests, and in busy urban centers. These respondents allow survey teams into their homes to answer questions about themselves, their families, and their lives. While I consider myself lucky to have the opportunity to meet and talk to so many people during survey fieldwork, there are certainly many challenges.

For the fieldwork phase of the 2014-15 Uganda Malaria Indicator Survey (UMIS), I spent a day with Patrick, Aziza, Irene N., Doreen, and Persis as they conducted interviews and tested children under 5 for anemia and malaria. Despite the challenges and even some homesickness, the team worked hard to collect data important to Uganda while enjoying the chance to travel throughout their country, make friends, treat children for malaria, and engage with different communities.

Patrick, Lab Technician

“When you test a person’s child and actually find he has malaria, at the end of the day you give them treatment and the guardians are usually grateful. You feel like you’ve helped out.”


Aziza, Interviewer

“It has been hectic. It hasn’t been easy. But at the end of the day we get data, even when you are very tired!”

“I’ve gotten the chance to educate women in the village… This is a way we connect with people in the village.”


Irene N., Interviewer

“Most times, we wake up at 6 so we can be on the road by 7 after breakfast. Then, we get in the field by 8, so each interviewer does 5 to 7 households and then test about 16 children in a day.”


Doreen, Nurse/Interviewer

“We realized that malaria is still a major problem. People are suffering. Young children under five are really suffering from malaria and also anemia.”

“It has actually given us an opportunity to appreciate and learn more about our communities, because you would not have ever imagined that malaria really exists and is killing so many people until you are there, testing and seeing positive rapid diagnostic tests (RDTs).”

Persis, Supervisor

“My motto is, ‘I don’t give up’ … when it comes to work I do it with all my heart. I don’t compromise work, I am really mindful of the quality at the end of the day.”

“I really wanted to work on the malaria survey because health is the first and foremost priority… I believe our work is good.”

The 2014-15 Uganda Malaria Indicator Survey (UMIS) was released on November 6th, 2015, and is the second UMIS conducted as part of The DHS Program. Fieldwork took place from early December 2014 to late January 2015. There were 17 teams for field data collection; each field team included 1 field supervisor, 3 interviewers (1 of whom was a nurse), 2 health technicians, and 1 driver. A total of 5,345 households were interviewed. The 2014-15 UMIS was implemented by the Uganda Bureau of Statistics (UBOS) and the National Malaria Control Programme (NMCP) of the Uganda Ministry of Health.

13 Apr 2016

How Many Demographers Does It Take to Make a Great Visualization?

How much time do you budget to create a data visualization?  The best visualizations, though they appear to be simple and clear, are often the result of dozens of attempts.

Demographers spend countless hours crunching data and preparing journal submissions, but not all take full advantage of data visualization, either in their exploratory analysis or in communicating their findings. Last month, data visualization enthusiasts met at the Population Reference Bureau for a hands-on workshop as part of the Population Association of America (PAA) Conference. The 4-hour interactive workshop featured presentations from DC-based data viz expert Jon Schwabish; Dr. Tim Riffe, demographer at the Max Planck Institute for Demographic Research (MPIDR); Jonas Schoeley (MPIDR); and Dr. Audrey Dorelien of the Minnesota Population Center. While each presenter had a unique focus, a common theme was clear: your first draft visualization should never be your final visualization. This lesson was put into practice as participants shared works-in-progress, received constructive feedback, and prepared "makeovers."

Clara Burgert and I have been working on a visualization project for over a year.  The original was published last summer but we’ve been reworking it for a journal submission. Our colleagues at the data viz workshop provided constructive feedback, and we have made yet another round of changes. Some of the many stages of our chart “makeover” are presented below.


Clara’s recently published analysis looks at 27 countries and 6 child health indicators. The goals of our visualization were to compare countries across these 6 indicators and to illustrate the inequity within countries by highlighting the worst- and best-performing sub-national regions. While some countries, such as Tanzania, have very high measles vaccination coverage overall, some regions within them perform very poorly. Meanwhile, other countries, like Rwanda, have moderately good vaccination rates with very little variation among regions. Our first real attempt at a publishable graphic looked like this:

[Figure: first draft of the 6-indicator graphic for the journal submission]

One of the challenges with this first graphic was that it didn’t use color very well. Clara needed color to distinguish the 6 indicators elsewhere in the report, so we wanted to integrate that color scheme here for consistency. At the same time, we realized that we could simplify our use of color in this first draft: while we had originally plotted the lowest region as a red circle, the reader doesn’t need color to know that it is the lowest; that is obvious from the axis and the left-to-right reading of a number line. So we tried this:

[Figure: revised draft with color coding by indicator]

This color scheme worked better to unify the other graphics in the report, and we were feeling pretty good about it. But we still had a few concerns and questions:

  • Was it okay to have the axis for the stunting indicator and under-five mortality the same size as the others even though they aren’t at the same scale?
  • Was it okay that we were sorting lowest to highest, instead of ordering countries in a consistent way?
  • How should we handle ordering of the data when for 4 of our indicators, a high data value is “good”, like vaccination coverage, while for 2 of our indicators, a high data value is bad, like mortality?
  • Were there any formatting tweaks we could make to improve readability?

It was this version that was shared at the PAA data visualization workshop. Through the feedback of experts and colleagues, we made some final decisions:

  1. We changed the axis of the stunting indicator to go to 100% so that it is consistent with the other percentages in the graphic. Some suggested that we move stunting and under-five mortality to a separate page to visually remind readers that the interpretation of these indicators is different (i.e., high values are bad). Ultimately, we decided that the layout of the 6 indicators was better for us in terms of publication, but agree that this is a trade-off and may confuse some less technical audiences.
  2. We decided to keep our sorting from low to high, as the main audience for this paper is looking at general trends, not for data for a specific country. However, reports by The DHS Program often have many audiences, and with that in mind, we created an additional graphic (not shown) that summarizes each of the indicators by country so that a stakeholder in Ghana can see his or her relevant data in one view, without searching for Ghana in each of the above graphics.
  3. Jon Schwabish had some quick and practical suggestions for making this graphic easier to read. His critique that it felt “heavy” resonated with us as the creators. He suggested thinning out the lines and substituting the big “X” marking the national average with a smaller circle.

[Figure: final version of the 6-indicator graphic]
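The final design choices above (thin connecting lines, a small circle instead of a big "X" for the national average, and low-to-high sorting) can be sketched with matplotlib. The country names and values below are made up for illustration, not the published data:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Illustrative (made-up) data: national average plus the lowest- and
# highest-performing sub-national regions for one indicator.
countries = {
    "Country A": {"low": 42, "national": 68, "high": 90},
    "Country B": {"low": 55, "national": 61, "high": 70},
    "Country C": {"low": 30, "national": 75, "high": 95},
}

# Sort countries low to high by national average, as in the final graphic.
order = sorted(countries, key=lambda c: countries[c]["national"])

fig, ax = plt.subplots(figsize=(6, 3))
for y, name in enumerate(order):
    d = countries[name]
    # Thin grey line spanning lowest to highest region ("thinned out").
    ax.plot([d["low"], d["high"]], [y, y], color="grey", linewidth=1)
    # Small grey dots for the extreme regions.
    ax.plot([d["low"], d["high"]], [y, y], "o", color="grey", markersize=4)
    # Small black circle for the national average (replacing the big "X").
    ax.plot(d["national"], y, "o", color="black", markersize=5)

ax.set_yticks(range(len(order)))
ax.set_yticklabels(order)
ax.set_xlim(0, 100)  # consistent 0-100% axis across indicators
ax.set_xlabel("Coverage (%)")
fig.savefig("indicator_dotplot.png")
```

The 0-100% x-axis mirrors decision 1 above: keeping every percentage indicator on the same scale makes cross-panel comparison honest, at the cost of some unused plotting space.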

There is a science to data visualization, but there is also a lot of subjectivity. Many solutions can be found only through trial and error. Often it takes time, several new sets of eyes, and dozens of drafts to settle on the best possible visualization for your data. While this is a big investment, there is growing evidence that it’s worth it. We are competing for just 1 or 2 minutes of our audience’s attention in a world filled with data and information. We hope to create a few visualizations that are worth stopping to explore.


07 Apr 2016

Measuring health care: The Service Provision Assessment Survey


When DHS and other population-based surveys indicate potential problems with a country’s health care system, questions such as these are raised:

“Are certain services available in health facilities?”
“What is the quality of those services?”
“Are there factors at the service delivery level that could be contributing to the problems?”

The Service Provision Assessment (SPA) survey attempts to answer these questions while fulfilling the need to monitor health systems strengthening in surveyed countries.

Let’s say a national strategy is initiated to address a growing obesity problem and its associated issues, diabetes and hypertension. A component of the strategy may focus on improving a country’s health facilities by increasing the number that have diabetes services available.

2014-15 Tanzania SPA Key Findings


It may also strengthen readiness of those facilities to provide quality services – more staff who are up-to-date on trainings for provision of diabetes services, more equipment (such as blood pressure apparatuses, adult weighing scales, and height boards), improved diagnostic capacity (the ability to conduct blood glucose and urine protein tests), and increased availability of medicines to manage diabetes. These are all indicators a SPA survey provides.

The improvements in service availability and readiness may lead to early identification of risk factors, early diagnosis and initiation of management, and, perhaps, a gradual decline in unmanaged diabetes.

2014-15 Tanzania SPA Key Findings video series
The SPA survey is designed to collect information from a sample of functioning health facilities in a country on the availability of services, readiness of facilities to provide health services in many areas, and measures of quality of care. Four different questionnaires are used to collect data at the facility, provider, and client levels. Survey data collection is done by teams of health workers.

If the DHS is a snapshot of a population’s health, the SPA is a snapshot of the service environment and those who provide and receive services, which drives population health. Though it can be challenging to directly link health facility data with population data, the SPA is useful in providing support and context to the DHS.

The first SPA surveys took place in Guatemala, Kenya, and Bangladesh in the late 1990s, and SPA surveys continue to be implemented today. To date, 22 SPA surveys have been conducted, the latest being the 2014 Bangladesh Health Facilities Survey and the 2014-15 Tanzania SPA. Ongoing surveys include the 2015 Nepal SPA and the Senegal Continuous SPA. Be the first to know when those will be available (along with all other surveys) by signing up for email alerts, or by following us on Facebook and Twitter.

