Before making major changes, please engage in the discussion on Issue #3


What are Reproducible Research & Open Science Badges?

These badges are small signals that recognise your efforts to improve the reproducibility and openness of your research. They are inspired by the Open Science Framework's Badges to Acknowledge Open Practices project. The goal of these badges is to acknowledge open and reproducible research practices.

How do I get a Reproducible Research & Open Science Badge?

Badges will be awarded to individual researchers who self-certify by listing URLs of their research products as evidence of their eligibility for a badge.

Submit your evidence for each item by completing this survey: http://goo.gl/forms/jm08DOJ2EI (the survey results are also available online)

Use the points scale below to tally up your score. Your total score must include at least one point from each of the three categories.


Reproducible Research & Open Science Badge Criteria

Each criterion is listed with its point value.

Category One - Reproducibility
Publication has complementary code & data repositories that can be used to reproduce its results (4 pts each)
Code repositories have detailed, plain-English README files describing the computational environment and dependencies, including version numbers, to facilitate independent reproduction (1 pt each; see the dependency-listing sketch after this table)
Computational environment captured (e.g. in a virtual machine image, Dockerfile, or Vagrantfile) in one or more research compendia (1 pt)
Primary software tools used for the research are open source (1 pt)
Submit your code for review (2 pts each)
Perform a code review for someone else (2 pts each)
Have the substantive results of your publication reproduced by someone else using your code and data (2 pts each)
Perform a reproducibility review for someone else (2 pts each)

Category Two - Open Data and Code
Code is developed in one or more version-controlled open repositories on GitHub, Bitbucket, GitLab, or a similar service (1 pt)
Code has one or more tests (1 pt; see the test sketch after this table)
One or more code repositories have an open license (e.g. MIT, BSD, GPL2) (1 pt)
One or more code repositories have a persistent URL (e.g. a DOI) pointing to a specific release (1 pt)
One or more datasets deposited in an institutional repository (not a personal webpage) (1 pt)
One or more datasets deposited in Figshare, Zenodo, or a domain-specific open data repository that issues persistent URLs (1 pt)
One or more sets of raw data published as soon as they are generated (5 pts)
One or more sets of raw data in open formats (e.g. CSV, TXT) (1 pt; see the CSV sketch after this table)
One or more online lab notebooks (2 pts)
An account at one or more web-based code-hosting services (e.g. GitHub, Bitbucket, GitLab) (1 pt)

Category Three - Open Access Products
One or more grant proposals & data management plans posted online and Open Access (1 pt)
Preprints in arXiv, bioRxiv, or PeerJ Preprints (1 pt each)
Green Open Access copies of publications in a disciplinary or institutional repository (1 pt each)
More than 75% of your papers from the last two years are Open Access (5 pts)
More than 50% of your papers from the last two years are Open Access (2 pts)
A Publons and/or PubPeer account (1 pt)
An ORCID iD (1 pt)
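
A few of these criteria lend themselves to short illustrations. First, for the README criterion in Category One, here is a minimal sketch, assuming Python 3.8 or later, of one way to collect dependency names and exact version numbers for pasting into a README; the output file name environment.txt is a hypothetical example.

```python
# Minimal sketch: record the Python version and all installed package
# versions so they can be listed in a compendium README.
# Assumes Python 3.8+ (importlib.metadata); "environment.txt" is a
# hypothetical file name chosen for this example.
import sys
from importlib.metadata import distributions

versions = {}
for dist in distributions():
    name = dist.metadata["Name"]
    if name:  # a few distributions lack a Name field
        versions[name] = dist.version

with open("environment.txt", "w") as fh:
    fh.write(f"Python {sys.version.split()[0]}\n")
    for name in sorted(versions, key=str.lower):
        fh.write(f"{name}=={versions[name]}\n")
```

Existing tools such as pip freeze or conda env export do the same job; the point is simply that exact version numbers end up in the compendium.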
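
For the "code has one or more tests" criterion, even a single small unit test qualifies. Here is a minimal sketch using Python's standard unittest module; clean_temperatures is a hypothetical analysis function invented for this example.

```python
# Minimal sketch of a unit test using only the standard library.
# clean_temperatures is a hypothetical function standing in for a
# piece of your own analysis code.
import unittest

def clean_temperatures(values):
    """Drop missing readings (None) and convert Celsius to Kelvin."""
    return [v + 273.15 for v in values if v is not None]

class TestCleanTemperatures(unittest.TestCase):
    def test_drops_missing_and_converts(self):
        self.assertEqual(clean_temperatures([0.0, None, 25.0]),
                         [273.15, 298.15])

if __name__ == "__main__":
    unittest.main()
```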
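
Finally, for the open-formats criterion, the standard library is enough to write raw data as plain CSV. Another minimal sketch; the file name and column names are invented for illustration.

```python
# Minimal sketch: write raw data to an open, plain-text CSV format.
# The file name and column names are hypothetical examples.
import csv

rows = [
    {"sample_id": "S001", "temperature_c": 21.4},
    {"sample_id": "S002", "temperature_c": 19.8},
]

with open("raw_data.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["sample_id", "temperature_c"])
    writer.writeheader()
    writer.writerows(rows)
```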

Now add up your score and find out what level you’re at:

Level (minimum points)
Gold 50 points
Silver 25 points
Bronze 10 points
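
As a hypothetical worked example: a researcher with one publication backed by complementary code and data repositories (4 pts), a detailed README (1 pt), open-source primary tools (1 pt), code on GitHub (1 pt), one test (1 pt), an open license (1 pt), and an ORCID iD (1 pt) scores 10 points, with at least one point in each of the three categories, and so earns Bronze.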

Where to find more information on these topics

If you only read one paper on reproducible research and computational methods, it probably should be this one:

Stodden, V and Miguez, S 2014. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research. Journal of Open Research Software 2(1):e21, DOI: http://dx.doi.org/10.5334/jors.ay

If you want to read more, the UW Reproducibility and Open Science group has collected some resources to help people get started with making their research more reproducible and open.

There are a few notes on more specific topics (contributions to these lists are most welcome!).

There are also many other helpful resources elsewhere:

rOpenSci’s Reproducibility in Science: A Guide to enhancing reproducibility in scientific results and writing, and its Further reading section

PDF library for Victoria Stodden’s Reproducible Research Group on Zotero

PDF library for the Berkeley Institute for Data Science (BIDS) reproducibility working group on Zotero