The Project
Being able to duplicate published research results is an essential part of conducting research, whether to build upon those findings or to compare new methods against them. This process is called “replicability” when the original authors' artifacts (e.g., code) are used, and “reproducibility” otherwise (e.g., when algorithms are re-implemented). Reproducibility and replicability of research results have recently gained considerable attention, with assessment studies conducted in various fields, and they are often seen as a driver of better result dissemination and transparency. In this project, we assess replicability in Computer Graphics by evaluating whether the code is available and whether it works properly. As a proxy for this field, we compiled, ran and analyzed 192 codes out of 454 papers from SIGGRAPH conferences (with exhaustive coverage of 2014, 2016 and 2018). In the analysis described in
- Nicolas Bonneel, David Coeurjolly, Julie Digne, Nicolas Mellado. Code Replicability in Computer Graphics. ACM Transactions on Graphics (Proceedings of SIGGRAPH 2020), 39(4), 2020,
we show a clear increase over time in the number of papers with available and operational research code, with notable differences across subfields, and we exhibit a correlation between code replicability and citation count.
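As a purely illustrative aside, the Python sketch below shows one way such a correlation could be quantified: a rank correlation between a binary "code runs" flag and citation counts. The numbers are made up and this is not the paper's actual data or statistical procedure, which are documented in the publication and in the GitHub project.

```python
# Illustrative sketch only (requires scipy): rank correlation between a
# hypothetical binary replicability flag and citation counts.
# The sample data below is invented for demonstration purposes.
from scipy.stats import spearmanr

# Hypothetical per-paper records: (code replicable?, citation count)
papers = [
    (True, 120), (False, 35), (True, 80), (False, 12),
    (True, 210), (False, 48), (True, 95), (False, 20),
]

replicable = [int(flag) for flag, _ in papers]
citations = [count for _, count in papers]

# Spearman's rho is robust to the heavy-tailed distribution of citations.
rho, p_value = spearmanr(replicable, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```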
This website provides an interactive tool to explore our results and evaluation data. It also lets both authors and users comment on the various codes. All materials (data, scripts, etc.) used to generate these results are available on the replicability.graphics GitHub project. The website contains complete data for the SIGGRAPH 2014, 2016 and 2018 papers, and partial data for other venues.
As a long-term goal, we would like to collect data for more SIGGRAPH venues, for SIGGRAPH Asia editions, for ToG papers, and for other computer graphics venues. If you want to help, see contributions, and check out the contributors.
Our project aims to provide the community with tools to improve the replicability of Computer Graphics research. While the Graphics Replicability Stamp Initiative (GRSI) encourages authors to make their research replicable, our project checks whether existing research is replicable in practice.
You can contribute new code analyses for computer graphics papers; we look forward to your contributions. You can also contact us.
Data Digest
Key numbers
- Number of reviews per year of publication
- Replicability results for reviewed papers
- PDF accessibility for reviewed papers
Explore
Explore the data and our replicability scores.
Analyze
Read our SIGGRAPH 2020 paper analyzing 374 papers from SIGGRAPH 2014, 2016 and 2018.
Contribute
Add comments or new analyses for Computer Graphics papers.