🏆

Artifact Review & Badging

Enhance your research impact with ACM's artifact reviewing and badging program

📦

Artifacts Available

Personal web page or institutional repository

Artifacts Evaluated

Functional and documented artifacts

🔄

Results Reproduced

Independent reproduction of results

📋 Overview

MMSys was the first ACM SIGMM conference to implement the ACM's artifact reviewing and badging policy. Authors are invited to formally submit their supporting materials following the process below.

🔬 About Artifact Review

The reviewers for the Reproducibility Track will attempt to reproduce the experiments/simulations and assess whether the artifacts and instructions submitted with the paper (traces, source code, tools, original datasets, etc.) support the claims made in the paper.

🎖️ Badges

The Reproducibility Track awards one or more badges, according to ACM's artifact reviewing and badging policy, to accepted papers that publicly release the artifacts used in the paper. Awarded badges appear as a highlight alongside the paper in the ACM DL.

📬

Submission Information

Send the required materials (see instructions below) to the Reproducibility Track chairs.

⚠️ Note that applying for a reproducibility badge is optional and does not influence the final decision regarding paper acceptance.

📝 Instructions and Required Materials

To make the reviewing task easier, we strongly recommend using a format for the artifacts that ensures easy reviewing and reproducibility. Note the following:

🐳

Either build a self-contained Docker image or VirtualBox virtual machine (tips), or use one of the tools (Collective Knowledge, OCCAM, or Code Ocean) that allow direct integration of your artifacts into the ACM DL. These tools cover a wide range of cases and programming languages and are worth considering in most situations.
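As an illustration of the self-contained Docker option above, a minimal packaging sketch might look like the following. All names here (image tag, script, dependency file) are hypothetical placeholders, not part of the official MMSys instructions; adapt them to your artifact.

```shell
# Hypothetical sketch: package an artifact as a self-contained Docker image.
# Base image, file names, and commands are placeholders for your own artifact.
cat > Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /artifact
# Bundle source code, traces, and datasets so the image is self-contained
COPY . /artifact
# Install the artifact's dependencies inside the image
RUN pip install --no-cache-dir -r requirements.txt
# Default command: reproduce the paper's main experiment
CMD ["python", "run_experiments.py"]
EOF

# A reviewer could then rebuild and rerun the experiments with:
docker build -t mmsys-artifact .
docker run --rm mmsys-artifact
```

The advantage of this layout is that reviewers need only Docker itself; every other dependency is pinned inside the image, which is exactly what makes the artifact "self-contained".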

💻

The code must be accessible and easily runnable, not presented as a black box.

📄

The appendix (which does not count toward the page limit of the camera-ready version) should be no longer than three pages and should include all the instructions needed to test the artifacts. We recommend this template from ctuning.org, where you can also find a detailed description of what information to provide.

💡

Important Note: If you have an unusual experimental setup that requires specific hardware or proprietary software, contact the Reproducibility Track chairs before submission.

📘

Updated Submission Guideline

NEW

We have prepared a comprehensive guideline in Google Colab for better illustration. This interactive guide will help you prepare your artifacts step-by-step.

Artifact Submission Guideline

Interactive step-by-step guide in Google Colab

Open Guideline

Alternative Submission Method: If you don't want to send us a zipped file, you can use Google Colab to submit your artifacts. This provides a cloud-based environment that reviewers can access directly.

Getting Started

New to Google Colab? Start here.

Welcome to Colab →
📊

Example Paper 1

Complete artifact example

Open Example →
📈

Example Paper 2

Another artifact example

Open Example →

⚖️ Evaluation Process

The artifacts go through a revision process, during which e-mail exchanges can occur between the authors and the Reproducibility Track chairs on behalf of the reviewers. The evaluators are asked to evaluate the artifacts based on the criteria defined by ACM.

📥

Submission

Submit artifacts to Reproducibility Track chairs

🔄

Review

E-mail exchanges with reviewers during revision

🏅

Badge Award

Receive recognition based on ACM criteria

Ready to Submit Your Artifacts?

Contact the Reproducibility Track chairs to begin the artifact review process and enhance the impact of your research.

Sponsors

💎 Platinum

Become a Sponsor

Support MMSys 2026 and gain visibility in the multimedia systems community.

Learn More