Contribution Details
Type | Conference or Workshop Paper |
Scope | Discipline-based scholarship |
Published in Proceedings | Yes |
Title | Transparency of CHI Research Artifacts: Results of a Self-Reported Survey |
Organization Unit | |
Authors | |
Presentation Type | paper |
Item Subtype | Original Work |
Refereed | Yes |
Status | Published in final form |
Language | |
Event Title | Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems |
Event Type | conference |
Event Location | Honolulu, Hawai’i |
Event Start Date | April 25, 2020 |
Event End Date | April 30, 2020 |
Place of Publication | New York, NY, USA |
Publisher | ACM Digital Library |
Abstract Text | Several fields of science are experiencing a "replication crisis" that has negatively impacted their credibility. Assessing the validity of a contribution via replicability of its experimental evidence and reproducibility of its analyses requires access to relevant study materials, data, and code. Failing to share them limits the ability to scrutinize or build upon the research, ultimately hindering scientific progress. Understanding how the diverse research artifacts in HCI impact sharing can help produce informed recommendations for individual researchers and policy-makers in HCI. Therefore, we surveyed authors of CHI 2018-2019 papers, asking if they share their papers' research materials and data, how they share them, and why they do not. The results (34% response rate) show that sharing is uncommon, partly due to misunderstandings about the purpose of sharing and reliable hosting. We conclude with recommendations for fostering open research practices. This paper and all data and materials are freely available at https://osf.io/3bu6t. |
Free access at | Related URL |
Related URLs | |
Digital Object Identifier | 10.1145/3313831.3376448 |
Other Identification Number | merlin-id:20081 |
PDF File | Download from ZORA |