Authors: Florian Echtler, Lonni Besançon, Jan Vornhagen, Chat Wacharamanotham
Posted: Fri, August 25, 2023 - 11:15:00
During the past few years, the Covid-19 pandemic has resulted in an unprecedented amount of research being conducted and published in a very short timeframe to analyze SARS-CoV-2, its vaccines, and its treatments. Concurrently, the pandemic also highlighted the limitations of our publication system, which enables and incentivizes rapid dissemination and questionable research practices.
While HCI research is usually not a foundation for life-and-death decisions, we face similar problems. HCI researchers and CHI community members have long criticized a lack of methodological and statistical rigor and a lack of transparent research practices in quantitative and qualitative works. Research transparency can alleviate these issues, as it facilitates the independent verification, reproduction, and—wherever appropriate—replication of claims. Consequently, we argue that the CHI community needs to move toward a consensus on research transparency.
Reaching this consensus is no trivial task, as HCI is an inter- or transdisciplinary field of research, employing myriad methods, and therefore cannot be subjected to a single, rigid rule set. For example, qualitative and quantitative data cannot be shared to the same degree without potentially identifying participants. Although the context of each research project might constrain feasible transparency practices, we believe that the overarching principle of research transparency can be applied to fit the vast variety of research that exists within our field.
Despite the benefits of transparent research practices, many of our efforts for more transparency within SIGCHI research have been met with considerable pushback. Junior researchers are worried about having their workload increase even more through additional documentation requirements, while more senior members of the community—who have built their careers on a research model that did not yet focus on transparency—may fail to see noticeable value in the additional effort required. Qualitatively oriented researchers, both within and outside HCI, do not see themselves represented in a discussion that often focuses on statistics, data collection, and related topics. Meanwhile, some quantitative researchers mistakenly believe that preregistration precludes exploratory analysis.
Ideally, any publicly available record of scholarship—whether it is a research paper, an essay, a dataset, or a piece of software—should be adequately transparent to enable the public to assess its quality and subsequent research to build upon it.
After all, the goal of research is not to pad one’s own h-index or to get a p-value below 0.05, but rather to increase the total sum of human knowledge. And in order to do so, we need to be able to build on the research that other scholars have done before us, so we can see farther. For this reason, transparency is indeed the one fundamental principle behind the open science movement, and it is so general that any of the subcommunities within HCI should be able to follow it.
The slow but steady move toward more transparency across all disciplines of research also provides a unique opportunity for the HCI community: At least some of the reluctance to adopt this approach stems from a lack of supporting tools. It is our field's core expertise to uncover and analyze problems, and to iteratively develop and evaluate potential solutions. Imagine that writing your statistical analysis would immediately update the figures in your paper and provide an automatic appendix with all calculations. Or that the coding process for interview quotes would keep a record of the shifting themes until the final result is reached, complete with an accurate log of the decisions that led to this point. These kinds of tools could be valuable far beyond HCI, for example, in related disciplines such as psychology and sociology, helping to establish a more transparent process there as well. Moreover, the interdisciplinary nature of HCI allows us to test and debate innovations in methods, tools, and policies. We believe that this richness in perspectives, together with the technological and design capacity of our field, could help resolve open science conundrums such as the challenges of sharing qualitative data.
For the future, we hope for guidelines that emphasize the values and opportunities within transparent research. We hope for reviewers who take them to heart and include the transparency of a manuscript in their assessment, not merely as superficial novelty. We hope for established researchers to lead by example, providing transparent insights into their research process, and for hiring committees and examination boards to value these contributions in their own right. Last but not least, we hope for all members of the community to consider how they can increase transparency in their research and publication practices.
2. Besançon, L., Bik, E., Heathers, J., and Meyerowitz-Katz, G. Correction of scientific literature: Too little, too late! PLOS Biology 20, 3 (2022), e3001572; https://doi.org/10.1371/journa...
3. Greenberg, S. and Thimbleby, H. The weak science of human-computer interaction. (Dec. 1991); https://doi.org/10.11575/PRISM/30792
4. Cockburn, A. et al. HARK no more: On the preregistration of CHI experiments. Proc of the CHI Conference on Human Factors in Computing Systems. ACM, New York, 2018, 1–12; https://doi.org/10.1145/3173574.3173715
5. Vornhagen, J.B. et al. Statistical significance testing at CHI PLAY: Challenges and opportunities for more transparency. Proc. of the Annual Symposium on Computer-Human Interaction in Play. ACM, New York, 2020, 4–18.
6. Talkad Sukumar, P., Avellino, I., Remy, C., DeVito, M.A., Dillahunt, T.R., McGrenere, J. and Wilson, M.L. Transparency in qualitative research: Increasing fairness in the CHI review process. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2020, 1–6.
7. Kapiszewski, D. and Karcher, S. Transparency in practice in qualitative research. PS: Political Science & Politics 54, 2 (2021), 285–291; https://doi.org/10.1017/S1049096520000955
8. Wacharamanotham, C., Eisenring, L., Haroz, S. and Echtler, F. Transparency of CHI research artifacts: Results of a self-reported survey. Proc. of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2020; https://dl.acm.org/doi/10.1145...