Black Box or Open Science? Assessing Reproducibility-Related Documentation in AI Research

peter.suber's bookmarks 2024-01-02


Abstract: The surge in Artificial Intelligence (AI) research has spurred significant breakthroughs across various fields. However, AI is known for its black-box character, which makes reproducing AI outcomes challenging. Open Science, emphasizing transparency, reproducibility, and accessibility, is crucial in this context, ensuring research validity and facilitating practical AI adoption. We propose a framework to assess the quality of AI documentation and apply it to 51 papers. We conclude that despite guidelines, many AI papers fall short on reproducibility due to insufficient documentation. It is crucial to provide comprehensive details on training data, source code, and AI models, and for reviewers and editors to strictly enforce reproducibility guidelines. A dearth of detailed methods, or inaccessible source code and models, can raise questions about the authenticity of certain AI innovations, potentially impeding their scientific value and adoption. Although our sample size inhibits broad generalization, it nonetheless offers key insights on enhancing the reproducibility of AI research.



From feeds:

Open Access Tracking Project (OATP) » peter.suber's bookmarks

Tags: oa.open_science oa.repositories oa.code oa.reproducibility

Date tagged:

01/02/2024, 12:59

Date published:

01/02/2024, 07:59