The ICML 2019 Code-at-Submit-Time Experiment | by Kamalika Chaudhuri | Medium


Summary:

"For the 36th International Conference of Machine Learning (ICML 2019), we decided to explore a new measure to incentivize code release — encouragement for voluntary code submission at the time of manuscript submission. Our goal was to promote a culture change by encouraging the community to submit code. We thought this is best done at submission time — as opposed to publication time — as this gives the program committee a chance to inspect and evaluate the code during the review process at their discretion. This is much like the option given to theoretical papers to submit an appendix with full mathematical proofs that can be checked by the program committee if needed....

How did the experiment turn out? We are delighted to report that a great deal of code was submitted. By our calculation, about 36% of the more than 3,000 submitted manuscripts included code with their paper submission. Additionally, 67% of the 774 accepted papers provided code at camera-ready time. Contrast this with NeurIPS 2018, where just under half of the accepted papers had code available with the camera-ready.

Who submitted code? The short answer is authors from all over the world, both academics and industry researchers. Of the papers that included code with their submission, 27.4% had an author from industry and 90.3% had an author from academia. Contrast this with the submission pool as a whole: 83.8% of all submissions had an author from academia, while 27.4% had an author from industry...."
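For a rough sense of scale, the quoted percentages can be turned into approximate paper counts. This is a minimal back-of-the-envelope sketch in Python, assuming only the round figures given in the quote (a lower bound of 3,000 submissions and 774 accepted papers), not exact conference numbers:

    # Approximate paper counts implied by the quoted percentages.
    # The submission total is quoted only as "more than 3000",
    # so 3000 is a lower bound, not the actual total.
    submissions = 3000   # lower bound on ICML 2019 submissions (from the quote)
    accepted = 774       # accepted papers (from the quote)

    code_at_submission = 0.36 * submissions   # papers with code at submit time
    code_at_camera_ready = 0.67 * accepted    # accepted papers with code

    print(f"code at submission time: >= {code_at_submission:.0f} papers")  # >= 1080
    print(f"code at camera-ready:    ~ {code_at_camera_ready:.0f} papers") # ~ 519

So roughly 1,080 or more submissions included code, and about 519 of the accepted papers provided code by camera-ready time.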

Link:

https://medium.com/@kamalika_19878/the-icml-2019-code-at-submit-time-experiment-f73872c23c55

Updated:

08/26/2021, 12:33

From feeds:

Open Access Tracking Project (OATP) » peter.suber's bookmarks

Tags:

oa.code oa.floss oa.incentives oa.reproducibility oa.ai

Date tagged:

08/26/2021, 16:33

Date published:

06/07/2019, 12:33