The reviewers for the Reproducibility Track will attempt to reproduce the
experiments/simulations and assess whether the submitted artifacts (traces,
source code, tools, original datasets, etc.) and the instructions
accompanying them support the claims made in the paper.
The Reproducibility Track awards badge(s), according to ACM's artifact
reviewing and badging policy, to accepted papers that publicly release
the artifacts used in the paper. The badge(s) appear as a highlight
alongside the paper in the ACM DL.
Note that applying for a reproducibility badge is optional and does not
influence the final decision regarding paper acceptance.
If you are interested in being awarded a badge, please follow the procedure detailed
below and send the required material to Jeroen van der Hooft (jeroen.vanderhooft [at] ugent.be)
and Nabajeet Barman (n.barman [at] ieee.org).
Instructions & Required Materials
To make the reviewing task easier, we strongly recommend a format for the
artifacts that ensures easy reviewing and reproducibility. Note the
following:
- Either make a self-contained Docker image or VirtualBox virtual machine
(see the Dockerfile sketch after this list), or use one of the tools that
allow direct integration of your artifacts into the ACM DL. These tools
(Collective Knowledge, OCCAM, and Code Ocean) are each described in a short
video; they cover a wide range of cases and programming languages and are
worth considering in most cases.
- The code must be accessible and easily runnable, not presented as a black box.
- The appendix (which has no effect on the page count of the camera-ready
version) should be no longer than three pages, including all the guidelines
for testing the artifacts. We recommend the template from ctuning.org,
where you can also find a detailed description of what information to
provide.
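
For authors taking the Docker route, a minimal Dockerfile sketch is shown
below. It assumes a Python-based artifact; the file names requirements.txt
and reproduce_results.py are hypothetical placeholders that you should
replace with your artifact's own dependency list and entry point.

    # Pin an explicit base image so reviewers get the same environment
    FROM python:3.11-slim

    # Copy the artifact (code, scripts, small datasets) into the image
    WORKDIR /artifact
    COPY . /artifact

    # Install the exact dependencies shipped with the artifact
    RUN pip install --no-cache-dir -r requirements.txt

    # Default command: regenerate the paper's results in one step
    CMD ["python", "reproduce_results.py"]

A reviewer can then reproduce the results with two commands:
docker build -t artifact . followed by docker run --rm artifact.
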
If you have an unusual experimental setup that requires specific hardware or
proprietary software, please contact the Reproducibility Track Chairs before
submission.
Submission Policy
Submitting your work for a reproducibility badge happens in two phases:
Phase 1
The artifacts go through a review process, during which e-mail exchanges
can occur between the authors and the Reproducibility Chairs on behalf of
the reviewers. Any potential problems or issues are reported to the authors,
potentially with requests for clarification. Authors can resolve issues as
they arise, maximizing their chances of being awarded a badge.
Final Deadline: 27 January 2025
Phase 2
The artifacts go through a review process led by the Reproducibility Track
Chairs and the reviewers. If there are problems or issues with the code,
reproducibility badges cannot be assigned.
Final Deadline: 21 February 2025
Difference Between Phase 1 and Phase 2
Submitting for a reproducibility badge during Phase 1 maximizes your chances
of being awarded a suitable reproducibility badge, as the Reproducibility
Chairs will get in touch with you to resolve any issues early on. Because
Phase 2 has a shorter review period, its review is stricter; any issues
with running the software/code might therefore result in missing out on
the reproducibility badge(s).
Evaluation Process
The artifacts go through a review process, during which e-mail exchanges
can occur between the authors and the Reproducibility Track Chairs on behalf
of the reviewers (if submitted by 27 January 2025). The evaluators are asked
to assess the artifacts based on the criteria defined by ACM.