Peer Reviewing Our Data Stories


SRCCON peers, reviewing. Photo by Erik Westra.

As journalists who analyze data for stories, we strive to hold ourselves accountable to a high standard of accuracy. But checking our work is rarely a straightforward process. Newsroom editors and fact-checkers might not have enough data expertise. Often, we need an outside opinion. Ideally, we could ask each other for advice, or even turn to experts in other fields for help. In academia, asking for outside comment before publication is broadly referred to as “peer review.”

During SRCCON, we discussed the challenges of applying the concept of peer review to journalism and some ways we've made it work for us.

(We are defining peer review here as the practice of having colleagues in a given academic field scrutinize each other's methods and findings. A more formal definition, which we are not using here, refers to a committee of referees evaluating a research article before deciding whether or not to publish it in a scholarly journal.)

Here are a few insights that arose from our session.

Basics of Journalistic Peer Review

What does peer review look like, in this context?

Journalistic peer review should be a two-step process: first, vetting your analysis pre-publication, and second, posting the data and methods along with the story.

We made an important distinction between verifying your work before running the story and showing your work in tandem with or after publication of the story. While we focused more on pre-publication review, we agreed that the open-sourcing of data analysis, which does not happen enough in academia, is an area where data journalists could take the lead.

What should be peer reviewed?

Stories can include a range of data findings, from a few numbers to supporting charts to in-depth investigations. When considering peer review, think about whether your findings warrant an outside opinion, or whether your numbers could just use a simple double-check. In our session, we decided that it's most useful to evaluate how much the basis of your story depends on the data analysis. If the entire story hinges on conclusions you reached through an analysis, then it's worth comparing notes with someone else. This might also include stories that predict or interpolate a finding.

What does peer review look like inside of a newsroom?

There are many ways to check your numbers internally. It can be as simple as talking to a colleague who understands a bit of the methodology behind the analysis. A stronger approach is to pair up with a team member before any actual analysis is done: the two of you independently decide on the appropriate methodology for the given data, then reconvene to compare the strengths and shortcomings of your approaches. And if you're part of a team of data journalists, you could meet once a week to spot-check each other's work or nerd out about quantitative methodologies.
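As a concrete illustration of that pairing idea, here is a minimal sketch in Python. Everything in it is invented for the example (the complaints.csv file, the "borough" column, the question being asked): each reviewer computes the headline figure from the raw data without looking at the other's code, and the two results are compared at the end.

```python
# A minimal sketch of an independent cross-check. All names here
# (complaints.csv, the "borough" column) are hypothetical.
import csv
from collections import Counter

def reviewer_a(path):
    # Reviewer A: count rows per borough using a Counter.
    with open(path, newline="") as f:
        return Counter(row["borough"] for row in csv.DictReader(f))

def reviewer_b(path):
    # Reviewer B: answer the same question, written independently.
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["borough"]] = counts.get(row["borough"], 0) + 1
    return counts

a = reviewer_a("complaints.csv")
b = reviewer_b("complaints.csv")
assert dict(a) == b, f"Findings diverge: {dict(a)} vs {b}"
print("Both approaches agree:", dict(a))
```

In practice, the two analyses would live in separate scripts or notebooks; the value is in the discipline, since a divergence surfaces a methodological disagreement before it reaches print.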

What about outside of a newsroom?

Sometimes, you don’t have anybody else to swap notes with. Even if you are on a data team, you might still need somebody with more expertise than your co-workers can offer.

In these cases, consider reaching out to peers who may have struggled with the same decisions or worked with the same dataset. Even if those journalists can't offer an opinion on your methodology, they can likely point you to someone who can. They might not see you as "the competition," since they often have a completely different focus and audience for that particular subject (campaign finance came up as a timely example). If you don't know where to start, try the topic-specific channels on the NewsNerdery Slack or the NICAR-L listserv.

If you’re looking for math and stats experts, you could consider stats.org, a group of statisticians that make themselves available to help journalists interpret findings and discuss computational approaches. Other subject-matter specialists like researchers or professors are another resource for vetting general findings.

Trickier Situations

If your story is sensitive, consider sharing a description of your methodology, without revealing specific findings. That way, people will comment on your analysis, not your conclusions.

One key difference between journalists and academics is the extent to which they share their results pre-publication. Academics are generally encouraged to share and solicit comments on their work at all stages during the publication process. Journalists, on the other hand, are less inclined to discuss what they are working on outside of the newsroom until the story runs, especially if the findings are controversial.

If you’re working on a story that’s expected to generate a lot of debate, one solution is to prepare a methodology piece that anticipates and addresses possible critiques but does not mention specific results. This is what some newsrooms call a “white paper.” Once you have that, you could consider circulating it to a small group for comments.

ProPublica did this for its Surgeon Scorecard project, which scores surgeons based on their complication rates. Before publishing the database, they shared a detailed white paper with scholars and patient safety experts.

This “methodology before results” strategy could work on a smaller scale, too. For example, talking to somebody at a federal statistical agency about the questions you’re looking to answer using their data can be extremely helpful, even if you do not mention the specific answers you’ve theorized.

Transparency After Publication

Make it easy for people to peer review your work even after publication by being transparent about it.

Opening up your data and methods to public scrutiny is another way of holding yourself accountable. Even a one-sentence summary of where you got the data and what you did with it (for example, "We counted inspections per year from the agency's public database and excluded records with missing dates") is better than nothing at all.

Being transparent is almost always easier said than done. Christine is looking into why this is the case: get in touch with her at ychristinezhang@gmail.com or fill out her survey if you’re interested in being a part of her research.

So Much More

As much as we covered, there are plenty of topics we didn’t have time to discuss, like best practices for releasing your data (some ideas here on Source and elsewhere) and ways to incentivize journalists to do more peer review. If you have any thoughts, please reach out to us @christinezhang and @ArianaNGiorgi.


This story originally appeared on the website of Source and is reprinted with permission.

Dallas-based Ariana Giorgi is a computational journalist at The Dallas Morning News, focusing on data analysis and visualization, and is a team member at @DMNinteractives.


Christine Zhang is a Knight-Mozilla OpenNews Fellow with the Los Angeles Times Data Desk. Previously, she was an analyst with the Brookings Institution in Washington, D.C.
