By Salem Solomon and Casey Frechette
In a 2014 press release, the U.N. reported that 5,000 people crossed from Eritrea to neighboring Ethiopia in October of that year. It was an alarming surge of people fleeing a country with indefinite military service and severe restrictions on personal freedom.
In a more recent, widely publicized report, the U.N. stated that “overall, it is estimated that 5,000 people leave Eritrea each month, mainly to neighbouring countries.” Annually, this outflow represents about 1 percent of the country’s population and accounts for a significant portion of refugees traveling to Europe.
The U.N. has not published or provided the authors with specific sources for its statistics on Eritrean refugees. Press releases become primary points of reference, despite lacking details on methodologies. But dozens of international news outlets have repeated the assertion that 5,000 people flee the country every month. The claim has appeared in Al Jazeera, BBC News, CNN, Fox News, Quartz, Reuters, the Guardian, the New York Times, the Wall Street Journal and many other prominent outlets.
The number appears in dozens of news stories, often without attribution. International outlets covering the migration crisis cite it extensively, often as leading evidence that Eritrean migration is a key contributor to the global refugee crisis.
But it may be completely wrong.
Examining where the number comes from and why it is repeated with such frequency reveals something significant about how journalists handle non-governmental and nonprofit sources. More importantly, it shows why the usual verification safeguards can break down when reporters deal with advocacy groups.
Why 5,000 Eritreans may not flee the country each month
At face value, the 5,000 claim rings true. Eritrea’s open-ended military conscription and curtailed liberties have been well-documented. With few prospects for the future, Eritreans — especially the youth — have reasons to leave. But that doesn’t prove the 5,000 statistic, and there are several reasons to question its veracity:
(1) The U.N. itself has published conflicting data. Based on data published in the 2015 UNHCR country operations profile – Ethiopia page, the number of Eritrean refugees who entered Ethiopia last year averaged about 2,500 a month. The Ethiopian Refugee Agency estimated 2,780 monthly arrivals of Eritreans in Ethiopia last year. These estimates suggest the 5,000 figure from 2014 was more likely a spike than an accurate average.
(2) We don’t know exactly how the number was derived, and the most likely collection methods are fraught with limitations. If the number of people leaving Eritrea comes from data collected at refugee camps, many questions remain about how officials determine nationality and where the data may be compromised.
(3) If the number is based on self-reports, there’s a specific reason other nationals might claim to be Eritrean. Many European nations give prima facie status to Eritrean refugees. Whereas other migrants face prolonged reviews that may end in deportation, Eritreans receive automatic asylum-seeker status. Despite also facing hardships and oppression, Ethiopians, for example, do not receive preferential treatment in Europe. Ethiopia’s population is 15 times larger than Eritrea’s, but the East African neighbors share similar languages and cultures, making it difficult to ascertain their citizens’ true nationalities at first glance.
(4) The broader claim from the Commission of Inquiry Report that 5,000 Eritreans leave the country each month doesn’t match other U.N. figures. The U.N. estimates that about 2,500 Eritreans entered Ethiopia monthly in 2015, whereas about 1,000 entered Sudan. This leaves another 1,500 unaccounted for, if the 5,000 number is accurate.
(5) Some U.N. officials designated to investigate these numbers and the root causes of migration have de-emphasized their importance. Officials have underscored the importance of the lives affected, whatever the particular numbers might be. But while insisting that it’s not about the numbers, these officials have so far been unable to clarify them, casting further doubt on the accuracy of the information provided.
30,000 or 3,770?
This isn’t an isolated case. Last April, an overcrowded boat capsized on its way to Europe, claiming about 800 lives and leaving only 28 survivors. This tragic event happened on the heels of recurring shipwrecks but gained more attention from the international media. Shortly after the incident, a spokesperson for the International Organization for Migration (IOM), Joel Millman, briefed reporters on the shipwreck and offered an estimate that deaths in the Mediterranean could reach 30,000:
“IOM now fears the 2014 total of 3,279 migrant [deaths] on the Mediterranean may be surpassed this year in a matter of weeks, and could well top 30,000 by the end of the year, based on the current death toll. It could actually be even higher.”
None of the reporters stopped to ask what accounted for such an increase. Instead, headlines such as “IOM: Mediterranean death toll could top 30,000 in 2015,” “30,000 migrants ‘may drown’ in 2015,” and “Mediterranean migrant deaths soar ‘exponentially’: IOM” spread across the Web. The jump from 3,000 to 30,000 within one year was never mathematically explained.
The 30,000 projection has since proven greatly inflated. By the end of 2015, 3,770 migrants had lost their lives crossing the Mediterranean, according to IOM.
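A back-of-the-envelope check, using only the figures cited in this article, shows how far the projection strayed from both the 2014 baseline and the eventual 2015 toll. This is an illustrative sketch; the monthly rates below are simple averages, not IOM’s actual methodology:

```python
# Figures cited in this article (IOM).
total_2014 = 3279       # Mediterranean migrant deaths in 2014
projected_2015 = 30000  # the spokesperson's year-end projection
actual_2015 = 3770      # the actual 2015 toll

# What monthly death rate would the projection imply?
implied_monthly = projected_2015 / 12   # 2,500 deaths per month
rate_2014 = total_2014 / 12             # roughly 273 per month in 2014
print(f"Projection implies {implied_monthly / rate_2014:.0f}x the 2014 monthly rate")

# How far off did the projection prove to be?
print(f"Projection exceeded the actual toll by {projected_2015 / actual_2015:.1f}x")
```

A one-minute calculation like this would have flagged that the projection required roughly a ninefold jump in the death rate — exactly the kind of unexplained increase reporters could have asked about.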
It is not uncommon for statistics associated with reputable organizations to go unchecked. Exaggerated numbers, faulty statistics that persist for decades and erroneous, misattributed information are all too common. The issues at the heart of these numbers — corruption, economic inequality, and violence against women and children — are real and pressing. But that doesn’t make the statistics true.
Why skepticism matters with NGOs and nonprofits
Researchers and analysts with deep expertise staff nonprofits, NGOs and similar groups. They bring experience and insight to the reports they produce. But their data collection methods are imperfect. That’s important for journalists to remember, especially since these organizations are seldom primary sources. Instead, they conduct research and work with third parties to procure the data that informs their analyses.
The use of unconfirmed, anonymous sources is common, and data collection methods aren’t often disclosed beyond general principles. This makes it difficult to ascertain the quality of information produced and equally hard to crosscheck findings. In some cases, direct measurement isn’t possible. The commission tasked with investigating the scope of the Eritrean refugee crisis, for example, was not allowed into the country. If they had been, direct observation would have been difficult since the government limits the access of independent bodies. Add inevitable human error, and the need for scrutiny only magnifies.
It can be uncomfortable to dissect claims made by organizations dedicated to helping vulnerable people. When experts point to persecution and suffering, sympathy — not skepticism — is the natural response. But that’s precisely why journalists need to scrutinize the information they receive from nonprofits and NGOs. Good intentions don’t guarantee accuracy. It’s up to reporters and editors to sift through claims and find the truth.
Far from being objective observers, nonprofits, NGOs and international organizations have agendas. They focus on specific issues and push for policy change and civic action. Highlighting the critical problems they strive to address helps raise the funds needed to further their efforts. This doesn’t diminish the merit of their work, but it does heighten the need to vet the information they provide.
With media-savvy liaisons, these groups are ready and willing to offer succinct, neatly packaged information to reporters working on tight deadlines. That’s one more reason to be on guard when looking for key facts and quotes to round out a story.
Why numbers matter
Officials and critics who downplay the significance of statistics make a reasonable argument. Numbers, however shocking, mean little compared to people’s lives. Even one person who is tortured, persecuted or forced to flee his home is one too many.
Still, facts matter. Getting numbers right counts for something. Understanding the scope of a problem, seeing whether a situation is improving or deteriorating, and making comparisons to surrounding areas all rely on precise, accurate numbers. With bad information, well-intentioned solutions may be ineffective or counterproductive.
Erroneous data can also empower those who seek to discredit anything an organization produces, undermining the credibility of both the NGO or nonprofit and the news organization.
What journalists can do
Faced with deadline pressures and sources with imperfect numbers, reporters face a difficult task. Still, steps can be taken to check the accuracy of information from NGOs and nonprofits.
(1) Do the math. See if the figure presented is plausible. Does it deviate greatly from previous trends? If so, is there a clear reason why? If the number sounds astronomical, dig deeper.
(2) Trace back to the original source of the number. Is it from a report, a press release, a press conference or an off-hand comment? Was the data obtained from a source on the ground? Was the source confirmed? Identified?
(3) Figure out who, exactly, is reporting the information. Large organizations often have many units operating under varying standards and practices. What one unit reports may not be consistent with the overall standards.
(4) Talk to reporters on the ground who have knowledge about the area. They might be able to corroborate what organizations are reporting. Also consider checking with other organizations to independently confirm data.
(5) Be wary of repeating information from other news organizations. A major news outlet’s reporting might lend undue credibility to a statistic. Always check the original source.
(6) Be as specific as possible when attributing information. Attribute a specific workgroup or division rather than an entire organization.
(7) Ask organizations how they collected their data. What methods were used? Surveys? Direct observation? Were samples taken? If so, how large were they, and how large is the total population?
(8) Be clear about what numbers are. Some statistics are estimates, but others are predictions or educated guesses based on available information. Strive to understand what the number is and how, specifically, it was derived. When possible, inquire into the likelihood that the number might be off and, if so, how far off it could be.
(9) Know when not to publish a number, and be clear about what isn’t known. It’s okay not to repeat every number and to acknowledge uncertainty.
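The “do the math” step above can be sketched in a few lines of arithmetic. This is an illustration using figures discussed in this article; the population value is a rough assumption, since Eritrea has not conducted a recent census and estimates vary widely:

```python
# Plausibility check on the claim that 5,000 Eritreans leave each month.
claimed_monthly = 5000
population = 6_000_000            # rough assumption; no recent census exists

annual_share = claimed_monthly * 12 / population
print(f"Claimed outflow: {annual_share:.1%} of the population per year")

# Cross-check against other U.N. figures cited in this article:
# about 2,500 monthly arrivals in Ethiopia and 1,000 in Sudan (2015).
accounted = 2500 + 1000
gap = claimed_monthly - accounted
print(f"{gap} people per month unaccounted for if the claim holds")
```

The first check confirms the claim’s internal consistency with the “1 percent of the population” framing; the second surfaces the 1,500-person monthly gap that the 5,000 figure leaves unexplained.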
Focusing on accuracy
Scrutinizing numbers isn’t about questioning the intentions of groups who advocate for vulnerable populations. It’s about seeking truth and fairness.
Journalists pride themselves on presenting facts. If they over-rely on information provided by NGOs, nonprofits and advocacy groups with a vested interest in raising awareness of critical problems, they risk compromising their credibility in the process.
Deeper skepticism and greater scrutiny of the data provided by these organizations can help journalists report more accurate stories.