
What is the future of data sharing for research?


Authors: Giovanna Vilaza
Posted: Mon, June 08, 2020 - 9:30:55

Digital data collection for health research usually follows well-established methods. In many of the labs that work with mobile sensing, research subjects are provided with consent forms, task instructions, and sensor devices or apps. Once they agree to participate, the expectation is that they will comply with the procedures and allow their lives to be digitally tracked. After that, they are usually dismissed.

Even though participants are such a vital part of scientific discoveries, they are often considered objects to be observed—one more entry in a database. Such well-established ways of placing those being monitored into passive roles have gained even more prominence. With the surge of Covid-19, there has been a noticeable increase in initiatives for health surveillance. From contact tracing to apps that monitor daily symptoms, the pervasiveness of smartphones is being exploited to collect data from large segments of the population. It is a blossoming field for those who work in this area, as the urgency to understand this illness is pushing mobile sensing in ways never before seen.

Given this sudden demand for broader behavioral monitoring, the debates over population-level surveillance have gone mainstream. In the particular case of contact tracing, academics are now discussing issues of individual privacy, the consequences of false positives (and negatives), and the actual efficacy of such an approach [1]. On the other hand, the media, governments, and tech companies are claiming that transmission speed may be reduced only if a significant part of the population is monitored continuously. Contact tracing has been enforced in countries in Asia and framed as a way to “help authorities identify virus hotspots and better target health efforts” [2]. 

By providing arguments that surveillance is the right path for recovery, governments and the media are forging a positive-only view of the subject. A consequence of the support for contact tracing and other symptom-tracking approaches could be a radical change in how people perceive privacy threats and accept being monitored for the “public good.” It could be speculated that efforts from the public sector and big corporations to convey the benefits of surveillance will lead the masses to believe this one-sided version of the story, without weighing its risks. Decisions about disclosure are known to involve a trade-off: when the perceived social or individual benefits outweigh the identified risks, people are willing to share sensitive information [3].

If a shift toward more public acceptance of health surveillance indeed prevails, national-level repositories of mobile-sensing data could also become very attractive to governments and scientists [4]. Large-scale platforms containing information such as clinical diagnoses, mobile-sensing data, and behavioral tracking data could enable remarkable epidemiological discoveries. Before Covid-19, the landscape of such platforms was dominated by genetic biobanks and clinical-trial repositories; mobile-generated data was still a novelty. Nowadays, massive centralized repositories containing information about thousands (or millions) of individuals, such as All of Us in the U.S. and iCarbonX in China, are growing around the world, and they include digital sensors as a data source.

However, if a shift toward more acceptance does not prevail, large-scale surveillance will be at risk of low cohort diversity. To derive significant and fair conclusions from a dataset, it must include people with a diverse range of characteristics. Unless most of the population is tracked, the knowledge acquired will not be representative and may benefit only those who were available and agreed to be monitored. As Daniela and Nicole Rosner discuss on this blog, “prioritizing the most likely to be reachable tends to benefit well-educated white people who have already long benefited from the healthcare system.” What can HCI, UX, and technology design practitioners and academics offer to facilitate more inclusive recruitment for data platforms? What knowledge, tools, and evidence have we produced (or can we provide) that can be useful in this context?

Besides inclusive recruitment, the search for public acceptance should not override the need to consider possible impacts on all segments of the population. Individuals from different backgrounds might have a different understanding of potential privacy risks, and people with stigmatized clinical diagnoses might suffer the consequences of a data leak asymmetrically [5]. Broader acceptance should not result in less public diligence about privacy and how data can be abused. Such individual differences need to be taken into account because ill-intentioned initiatives may lure people with the promise of future advances in research, but come with a hidden agenda [6]. As Christopher Frauenberger states, “We might see the coronavirus serving as the scapegoat to implement modes of mass behavior manipulation by private companies.” How could HCI knowledge and approaches be used to support and protect citizens from these scenarios? Could HCI help overcome the uneven understanding of risks and help tackle vulnerabilities in case of privacy breaches?

As mentioned earlier, digital data collection for health research often follows well-established approaches. The pandemic has brought more attention to the subject of population surveillance, as seen in the reflections from Rosner and Frauenberger. However, the Covid-19 emergency has not changed the passive role attributed to those having their symptoms and contacts monitored. Most of the decisions about what data will be tracked, how it will be used, and who will have access to it are made from the top: by governments, health authorities, research institutions, and big corporations. When data repositories are built this way, power and knowledge are given to those who store the data, not to those who provide it [4]. This strengthens the already existing inequalities between those who contribute data and those who receive it.

The most significant change that the pandemic should bring is not that surveillance becomes more broadly accepted. A real change would be to see those proposing surveillance platforms finally placing citizens at the core of their decisions, by listening to their concerns and providing them with direct protection and benefits. If people are to be asked to open up their lives for health surveillance or research, they should be respected, and their preferences prioritized. It is about time we put more effort into understanding the needs of different segments of the population and designing for more inclusive participation and agency in research. The well-established approaches for data collection no longer suffice, as behavioral monitoring is being considered at a national level. Aggregated data might mean better healthcare now and in the future, but it is also a tool for power and mass control [6]. The path to acceptance should involve respect, transparency, and an ethic of involvement by communities from all backgrounds [7].

More than ever, those who are in public, academic, and industry positions hold the responsibility of taking into account any potential for harm that novel ideas can bring to each individual. This pandemic, or any other alarming situation in the future, should not mean that moral principles and personal autonomy are put aside. Large-scale digital surveillance for public health may gain momentum with contact tracing now. Still, we need to keep reflecting, discussing, and pushing for ethical development in the field, through the papers we write, the products we build, and the ideas we share with others.

The pandemic has been a challenging time in many aspects, but it can also mark a moment when meaningful changes began. It forced many to stop, and some to reconsider how things have been done until now—and how different they could be. From this process, hopefully, a brighter future can emerge for data sharing, health surveillance, and research platforms alike—a future in which acceptance does not mean renouncement of rights and values, but rather a conscious choice based on terms and conditions that are negotiated and never imposed. This should become the new normal. The next advances in data-collection practices depend on us, researchers and designers in the HCI and health tech field, as we choose how we conduct our own projects and support those of our community.

Endnotes

1. Gillmor, D.K. Principles for technology-assisted contact tracing. ACLU white paper. April 16, 2020.

2. Phartiyal, S. India orders coronavirus tracing app for all workers. Reuters Technology News. May 2, 2020.

3. King, J. "Becoming part of something bigger": Direct-to-consumer genetic testing, privacy, and personal disclosure. Proc. of the ACM on Human-Computer Interaction 3, CSCW (2019), 1–33.

4. Kostkova, P., Brewer, H., de Lusignan, S., Fottrell, E., Goldacre, B., Hart, G., Koczan, P., Knight, P., Marsolier, C., McKendry, R.A., Ross, E., Sasse, A., Sullivan, R., Chaytor, S., Stevenson, O., Velho, R., and Tooke, J. Who owns the data? Open data for healthcare. Frontiers in Public Health 4 (2016), 7.

5. Petelka, R., Van Kleunen, L., Albright, L., Murnane, E., Voida, S., and Snyder, J. Being (in)visible: Privacy, transparency, and disclosure in the self-management of bipolar disorder. Proc. of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2020, 1–14.

6. Zuboff, S. Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology 30, 1 (2015), 75–89.

7. Costanza-Chock, S. Design Justice: Community-Led Practices to Build the Worlds We Need. MIT Press, 2020.




Giovanna Vilaza

Giovanna Vilaza is a TEAM early-stage researcher, halfway through her Ph.D. in the Department of Health Tech, Technical University of Denmark. Her current project is about a participant-centered future for behavioral monitoring in open-access data platforms. She is an alumna of University College London and the KTH Royal Institute of Technology. [email protected]


