We are very pleased that the 24th ACM SIGPLAN Annual Symposium on Principles and Practice of Parallel Programming (PPoPP’19) has set a new record: artifacts were evaluated for 20 of the 29 accepted papers (~70%), a dramatic increase from ~30% at PPoPP’15. You can find the list of evaluated artifacts together with the ACM reproducibility badges at cTuning.org/ae/artifacts.html#ppopp2019.
We would like to thank the Artifact Evaluation Committee of 35 researchers and engineers for their tremendous effort in evaluating very diverse artifacts within two weeks, following the standard ACM Artifact Review and Badging guidelines! They did a great job communicating with the authors via HotCRP to resolve many technical issues and enable successful evaluation of all the artifacts.
We are also very grateful to Maria Grazia Giuffreda and the Swiss National Supercomputing Centre (CSCS) for providing access to the Piz Daint supercomputer needed to evaluate several artifacts, and to the Texas Advanced Computing Center (TACC) for providing access to Stampede2.
We are also very grateful to all the authors and reviewers for providing very constructive feedback via a shared document. We plan to gradually address these issues and concerns to continue improving future AE. For example, since we now have to deal with so many submitted artifacts, we have started working on open-source tools to automate the preparation of experiments, the validation of results, the assignment of badges, and the reuse of all shared research components (code, data sets, models, scripts). We want to help authors dramatically reduce the time and effort needed to share complex research workflows, while enabling evaluators to validate results automatically in a standard and portable way. You can find examples of such automation here. Our additional resolution for 2019 is to introduce AE at computer architecture conferences.
We thank Michel Steuwer for helping us configure HotCRP (a popular platform for managing the conference review process) for AE. We use HotCRP because it enables easy and convenient communication between the authors, evaluators, and chairs when fixing issues during artifact evaluation and validation of experimental results.
However, a minor downside of HotCRP is that it is quite tricky to configure for AE, and there is no support for exporting and importing configurations. That is why we have decided to share the typical AE configuration from PPoPP’19: you can simply copy and paste the different fields to configure your own HotCRP AE instance:
- Settings — Basics
- Settings — Messages
- Settings — Submissions
- Settings — Submission form
- Settings — Reviews
- Settings — Review form
- Settings — Decisions
We have also prepared a CK module to automatically generate the list of artifacts with their authors and all the badges required by the ACM. Select all artifacts on the main HotCRP page, choose “Download -> Paper information — JSON”, and download the resulting file (for example, “conference-data.json”). You can then process it as follows:
$ git clone https://github.com/ctuning/ck
$ ck/bin/ck pull repo:ck-artifact-evaluation
$ ck/bin/ck process_hotcrp ae --json_file=conference-data.json
CK will create two files, “tmp-ae.html” and “tmp-ae.txt”, in your current directory, containing the list of artifacts in HTML and plain-text formats.
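If you prefer not to use CK, the same processing can be sketched in a few lines of Python. The snippet below is a minimal, hypothetical example of reading a HotCRP-style JSON export and printing one line per artifact; the field names (“title”, “authors”, “first”, “last”) are assumptions about the export structure, so adjust them to match the actual contents of your “conference-data.json”.

```python
import json

def list_artifacts(json_text):
    # Assumed structure: a JSON array of paper objects, each with a
    # "title" string and an "authors" list of {"first", "last"} dicts.
    # Adjust the field names to match your real HotCRP export.
    papers = json.loads(json_text)
    lines = []
    for paper in papers:
        authors = ", ".join(
            f"{a.get('first', '')} {a.get('last', '')}".strip()
            for a in paper.get("authors", [])
        )
        lines.append(f"{paper['title']} ({authors})")
    return lines

if __name__ == "__main__":
    # Tiny inline sample standing in for a real "conference-data.json".
    sample = (
        '[{"title": "Example artifact", '
        '"authors": [{"first": "Ada", "last": "Lovelace"}]}]'
    )
    for line in list_artifacts(sample):
        print(line)
```

For a real evaluation you would read the downloaded file with `open("conference-data.json")` and emit HTML or text output, much as the CK module does.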
Happy artifact evaluation!