
Privacy and ethics concerns when using UX research platforms


Author: Michal Luria
Posted: Wed, July 26, 2023 - 3:03:00

Remote user experience (UX) research is not a new phenomenon [1], but the pandemic has accelerated it, necessitating the adaptation of traditionally face-to-face methods to remote settings [2]. The past few years have seen a surge of online platforms, such as dscout, Lookback, Metricwire, and UserTesting, to name just a few. Remote UX research platforms are marketed primarily as tools for industry UX researchers, but they are also occasionally used by academics.

These user-friendly interfaces promise remote UX researchers everything they need to conduct a successful study. They support many methods and provide tools for every research stage, from recruitment via participant pools to built-in analysis. But the platforms’ quick-and-easy approach also creates ethical issues when platforms don’t fully consider researchers’ obligations toward their participants [3].

UX research frequently involves people, and therefore risks causing harm [4]. For example, people may be asked to share personal information about sensitive topics as part of research, like their sexual orientation or their use of adult websites. Without the right safeguards, this information could get into the wrong hands. Or participants may be asked to view social media content and subsequently be exposed, perhaps unintentionally, to harmful content. These scenarios are even more delicate if a study involves minors or other vulnerable groups [5].

In corresponding with several UX research platforms, I was surprised to discover consistent ethical and privacy gaps in their ability to address the possible risks of human subjects research, particularly in their consent, data ownership, and safety protocols. Here, I outline how these gaps could be addressed for better research practices.

Consent should be mandatory. When using a service’s participant pool, researchers can, but are not obligated to, include a consent process. Platforms argue that participants, by signing up, have already agreed to the terms and conditions, and that this agreement covers participation in any research study without further consent.

This does not comply with ethical research standards or the requirement that participants be informed: Participants should be able to explicitly consent to participation in every study, after having been fully informed about its goals, risks, and outcomes [6].

Not only is obtaining consent optional on some UX research platforms, but collecting signatures often requires a premium subscription. The alternative for non-premium customers would be to put the consent information into a survey question and ask participants to click “Agree.” As for parental consent for minors, which laws such as the General Data Protection Regulation (GDPR) require, one platform suggested simply including another survey question asking the minor whether their parents had agreed to their participation.

Data should be owned by researchers, not the platform. Normally, personal data for research would be collected for “legitimate research purpose only” [7]. But many UX research platforms claim ownership of the data in their terms of service, allowing them to use it in any way they see fit. Many platforms explain that they use this participant data to improve their services, but nothing prevents them from sharing or selling it without participants’ knowledge.

In my correspondence with one such platform, I inquired about the ability to delete data upon the completion of a study. The platform agreed to give me special permission to request data deletion on behalf of my participants. This would at least limit the platform’s access to participants’ data to the duration of the study. However, many platforms don’t allow researchers to download data, meaning that deletion would remove the researchers’ access too.

Researchers should mitigate adverse events. Researchers must plan for the possibility of unintended harm to participants’ safety and well-being [8]. In a forthcoming study for which I considered using a remote UX research platform, there was a rare possibility that participants would experience emotional distress. To account for this scenario, our research team included a certified clinician who could reach out to participants, assess suspected harm, and fulfill any obligation to escalate and report.

Platforms usually withhold participants’ contact information to spare researchers additional exposure to identifiable data. In a case of suspected harm to a participant, however, I expected the platform to collaborate with me to best support the participant’s well-being. Instead, the platform stated that in such an event, I would report it to them and they would handle it internally.

This is unacceptable. The consent agreement is between me, the researcher, and the participant. I am responsible for any safety risks my research poses, including cases that require special attention. No matter how well prepared a platform may be, it should not be the one making the call on how to address concerning responses to research questions.

Concluding thoughts. The current state of affairs makes it hard for university researchers to use these remote UX research platforms. Universities have detailed training and research protocols for anyone involved in human subjects research, and many of them would not approve the terms of the current platforms. By contrast, many companies and organizations do not have strong ethics review processes for research conducted by employees or contractors, and they commonly use remote UX research platforms. Even though many industry researchers have gone through academic ethics training, these platforms provide an easily overlooked path toward conducting human subjects studies with few safeguards or considerations for how data is collected, stored, and used.

This may be because many of these platforms were designed with the researcher’s user experience as the first priority, leaving participants’ well-being as an afterthought. The result is that participants are exposed to unnecessary risk of harm. While in most instances this would go unnoticed, in extreme cases things could go very wrong.

But not all UX research platforms are created equal: There are a few that prioritize privacy and participant safety. They are not as user friendly and usually do not include participant recruitment. But because recruitment in these safer alternatives is under the researchers’ control, so are the data, the consent process, and the handling of adverse events. We should be able to have the best of both worlds: the capabilities and convenience we desire along with the privacy and safety practices we need.

Endnotes

1. Moon, Y. The effects of distance in local versus remote human-computer interaction. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 1998.

2. Süner-Pla-Cerdà, S. et al. Examining the impact of Covid-19 pandemic on UX research practice through UX blogs. In Design, User Experience, and Usability: UX Research and Design. Springer, Cham, 2021.

3. Fiesler, C. et al. Exploring ethics and obligations for studying digital communities. Proc. of the 2016 ACM Conference on Supporting Group Work. ACM, New York, 2016.

4. National Institutes of Health. Human subjects research; https://grants.nih.gov/policy/...

5. Antle, A.N. The ethics of doing research with vulnerable populations. Interactions 24, 6 (2017), 74–77.

6. Vitak, J. et al. Beyond the Belmont principles: Ethical challenges, practices, and beliefs in the online data research community. Proc. of the 19th ACM Conference on Computer-Supported Cooperative Work. ACM, New York, 2016.

7. Alsmadi, S. Marketing research ethics: Researcher’s obligations toward human subjects. Journal of Academic Ethics 6 (2008), 153–160.

8. Do, K. et al. “That’s important, but…”: How computer science researchers anticipate unintended consequences of their research innovations. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 2023.



Michal Luria

Michal Luria is a researcher at the Center for Democracy & Technology. Her work makes use of immersive and human-centered design research methods to envision and critique interactions with technology. Using these methods, she translates research insights into thought-provoking interactions and necessary discussions of technology ethics and policy. [email protected]


