MMSys leads the way in research transparency and reproducibility. Submit your artifacts to earn ACM badges that enhance your paper's visibility and impact in the ACM Digital Library. Our Reproducibility Track evaluates your experimental materials against rigorous ACM standards.
Enhance your research impact with ACM's artifact reviewing and badging program
Personal web page or institutional repository
Functional and documented artifacts
Independent reproduction of results
MMSys was the first ACM SIGMM conference to implement the ACM's artifact reviewing and badging policy. Authors are invited to formally submit their supporting materials following the process below.
The Reproducibility Track reviewers will attempt to reproduce the experiments/simulations and assess whether the clearly defined artifacts and instructions submitted with the paper — including traces, source code, tools, original datasets, etc. — support the claims made in the paper.
The Reproducibility Track awards badge(s), according to ACM's artifact reviewing and badging policy, to accepted papers that publicly release the artifacts used in the paper. The badge(s) appear as a highlight alongside the paper in the ACM DL.
Send the required materials (see instructions below) to the Reproducibility Track chairs.
⚠️ Note that applying for a reproducibility badge is optional and does not influence the final decision regarding paper acceptance.
To make the reviewing task easier, we strongly recommend using a format for the artifacts that ensures easy reviewing and reproducibility. Note the following:
Either build a self-contained Docker image or VirtualBox virtual machine (tips), or use one of the tools (Collective Knowledge, OCCAM, or Code Ocean) that allow direct integration of your artifacts into the ACM DL. These tools cover a wide range of cases and programming languages and are worth considering in most cases.
The code must be accessible and easily runnable, not presented as a black box.
The appendix (which has no effect on the page count of the camera-ready version) should be no longer than three pages, including all the guidelines for testing the artifacts. We recommend this template from ctuning.org, where you can also find a detailed description of what information to provide.
Important Note: If you have an unusual experimental setup that requires specific hardware or proprietary software, contact the Reproducibility Track Chairs before submission.
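As a concrete illustration of the self-contained Docker image approach recommended above, a minimal Dockerfile might look like the sketch below. All file names (`requirements.txt`, `run_experiments.sh`) and the base image are hypothetical placeholders — adapt them to your own artifact:

```dockerfile
# Hypothetical sketch: package an artifact so reviewers can reproduce the
# experiments with a single `docker build` / `docker run` pair.
FROM ubuntu:22.04

# Install only the dependencies your experiments actually need.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Copy source code, datasets/traces, and evaluation scripts into the image.
WORKDIR /artifact
COPY . /artifact
RUN pip3 install --no-cache-dir -r requirements.txt

# One command should reproduce the paper's experiments end to end.
ENTRYPOINT ["bash", "run_experiments.sh"]
```

A reviewer could then run, for example, `docker build -t mmsys-artifact .` followed by `docker run --rm mmsys-artifact`, without needing to install any dependencies on their own machine.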
We have prepared a comprehensive guideline in Google Colab for better illustration. This interactive guide will help you prepare your artifacts step-by-step.
Interactive step-by-step guide in Google Colab
Alternative Submission Method: If you don't want to send us a zipped file, you can use Google Colab to submit your artifacts. This provides a cloud-based environment that reviewers can access directly.
The artifacts go through a review process, during which the Reproducibility Track chairs may exchange e-mails with the authors on behalf of the reviewers. The evaluators assess the artifacts against the criteria defined by ACM.
Submit artifacts to Reproducibility Track chairs
E-mail exchanges with reviewers during revision
Receive recognition based on ACM criteria
Contact the Reproducibility Track chairs to begin the artifact review process and enhance the impact of your research.
Support MMSys 2026 and gain visibility in the multimedia systems community.
Learn More