Features

XXXI.5 September - October 2024

Mosaics of Personal Data: Digital Privacy During Times of Change


Authors:
Wendy Moncur


All of us go through times of change. As some of these changes can be personally significant and socially sensitive, we should have choices about what details, if any, related to a change are shared with others. Sometimes, we may need to distance ourselves from, or even sever ties with, people to whom we were previously connected. But privacy can be elusive, as digital technologies and the Internet pervade almost all aspects of our lives. While understanding our own needs to balance privacy and connection is difficult, navigating available options for sharing is often even harder. Most of us are deeply entangled in digital infrastructures, our data collected, assembled, and at risk of being leaked. Our information can be shared with a much larger audience and for much longer than we intended—long after the information may even be relevant. As a result, past ties may be difficult to truly sever.

Insights

Fragments of our personal data are scattered across multiple platforms and across time by ourselves and others.
Linking these fragments can produce a mosaic that affords significant and unintended insights to others—both human and AI.
We need usable, privacy-enhancing tools that assemble and reflect these mosaics back to individuals, enabling them to understand and curate what they reveal about themselves.

In this article, I'll look at what personal data is, who shares it, and why it's so tricky to control once shared. Then I'll provide examples of how significant life changes can create greater demands on us around how we navigate and negotiate the privacy, or at least controlled sharing, of our data. I'll share some tips about what questions to ask and how to work through some scenarios, and hopefully help guide your decision making.

Personal Data

As Brian Parkinson and colleagues write, personal data is "the digital data that is descriptive of an individual," the "mosaic…digitally extended self" [1]. It involves a person's real name, pseudonyms, and any data linked to their identity. It also refers to anonymous content that does not intentionally reveal an individual's identity but does so incidentally. Personal data encompasses seemingly mundane information such as name, address, date of birth, photos, social media posts, and location. It can also include sensitive information, such as sexual identity, sexual preferences, and details about current or past sexual activity; political and religious views; health information, as well as genetic and biometric data such as fingerprints and facial recognition data; personal interests and affiliations such as trade union membership; information about cultural and social identity; and financial information such as company directorships, employment, and contributions to charities. Such information can potentially be recorded across many platforms and for long periods of time, and thus be available for subsequent retrieval, analysis, and reposting.


The mosaic of personal data isn't just created and shared by the person themselves; it is co-constructed. Many of us will be tagged in social media posts by friends and family. We will be on voter registration lists and perhaps company directorships, held and made visible by government agencies. Our profiles may appear on our employers' websites. A bystander at a public event may include us in their footage of the event and post it online—inadvertently revealing our location, whom we are with, and what we are doing. Hidden trackers embedded within online services record people's interactions with those services. How the information gathered by trackers is subsequently used is opaque to the user and may be unwelcome. An example of this is the U.K. Metropolitan Police's 2023 privacy breach [2]. The department accidentally shared reports of sexual offenses and domestic abuse, including the victims' identities, to Facebook via the Meta Pixel tracking tool embedded in the police department's crime-reporting website.
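To make that leak mechanism concrete, here is a minimal sketch of what an embedded tracking pixel typically transmits. The endpoint, parameter names, and URLs below are invented for illustration; they are not Meta Pixel's actual interface. The point is simply that the address of the page the pixel sits on travels with the request, and on a crime-reporting site the page address itself can be sensitive.

```python
# Minimal sketch: a tracking pixel fires an HTTP request to a third
# party that includes the URL of the page it is embedded in. If that
# URL encodes something sensitive, the third party receives it too.
# All names here are hypothetical, not any real tracker's API.

from urllib.parse import urlencode

def pixel_request(third_party: str, page_url: str, event: str) -> str:
    """Build the request a hypothetical tracking pixel would send."""
    return f"https://{third_party}/collect?" + urlencode(
        {"dl": page_url, "ev": event}  # dl = document location
    )

# On a crime-reporting site, the page URL itself reveals the report type:
print(pixel_request(
    "tracker.example.com",
    "https://police.example.gov/report/sexual-offence/step-2",
    "PageView",
))
```

No one "shared" the report deliberately; the sensitive detail rode along inside routine analytics traffic.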

Individual fragments of personal data shared online may seem to have minimal risk associated with them. Linking these fragments across platforms and across time, however, can produce a mosaic that affords significant and unintended insights into an individual's habits across their work and private lives; their personality, emotions, and mental health; as well as whom they hang out with and where [3]. In extremis, such insights can affect civil liberties. This is evident in the digital authoritarianism exerted by the Chinese Communist Party, whereby AI-powered surveillance is used to enforce citizens' "acceptable" behavior via the compilation of social credit scores grounded in extensive personal data [4]. Citizens with "bad" scores can be excluded from state-sponsored benefits, such as permission to travel by air or rail. Beyond individuals, a bigger picture invites concern in national security contexts. Adversaries with malign intent can use mosaics of personal data to deduce strategic vulnerabilities with political, national, and global dimensions [5].
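As a rough illustration of how such a mosaic comes together, the sketch below links invented records from three sources on nothing more than a reused username. Real linkage can also exploit photos, writing style, location, and timing; the data, field names, and platforms here are all hypothetical.

```python
# Minimal sketch of mosaic-building: join fragments scattered across
# platforms on a shared identifier (here, a reused handle). Each record
# is individually mundane; the combination is not. All data is invented.

fitness_app = [
    {"user": "jo_runs", "route_start": "55.86,-4.25", "time": "07:05"},
]
company_register = [
    {"user": "jo_runs", "director": "Jo Smith", "home_address": "12 Elm St"},
]
social_posts = [
    {"user": "jo_runs", "text": "Last day at the office before treatment."},
]

def link_fragments(handle: str, *sources):
    """Merge every record, across all sources, that shares one handle."""
    mosaic = {}
    for source in sources:
        for record in source:
            if record.get("user") == handle:
                mosaic.update(record)
    return mosaic

# The merged result names a person, where they live, when and where
# they exercise, and hints at their health--an unintended insight no
# single fragment afforded.
print(link_fragments("jo_runs", fitness_app, company_register, social_posts))
```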

Personal data is persistent. It is easy to replicate, reproduce, edit, and manipulate. And it is hard to get rid of once it's online or recorded digitally in one or more databases. Deletion is either not supported or is practically impossible because of technical and logistical roadblocks, or the data is so deeply entangled with other data that it cannot be extracted. For example, think of all those great photos of you with your best friend and their pesky ex.

Personal data is also highly accessible via search engines, which are designed to find information and make it available to all. One problem here is that information that is incorrect, outdated, or irrelevant can still be found and shared with unknown others. In the new age of generative AI, "hallucinations" are routinely presented as truth: falsehoods, delivered with authority, that may be built on incorrect personal information or inaccurate composites. The key point is that it is almost impossible to comprehensively audit, correct, curate, or delete our personal data once it has appeared online. The most optimistic outcome many hope for is that, like leaves in a pond, our personal data will gradually drift into murky obscurity over the years, failing to surface on the first few pages of search results or to be readily dug up by information-collation engines driven by large language models.

Times of Change

We have been researching how the issues outlined above affect everyday people and how technologies do and do not align with what they need. Our explicit interest lies in times of change. For more than a decade, we have carried out privacy studies focused on times of change, including end-of-life, relationship breakdown, leaving the military, cancer diagnosis, and coming out as LGBTQ+ [3,6].

Starting. Stopping. Joining. Leaving. Evolving. We all go through times of change. Whatever we leave behind—a school, a job, a marriage, a gender identity—it's all change, and digital systems don't handle such changes well. Technological infrastructures, tools, and apps are designed well for onboarding and maintaining users. They are designed less well for supporting the natural evolution of our identities across the lifespan. When people want to change how they appear online—whether that be through more or different levels of privacy; editing, redacting, or deleting information already collected; or who they are connected to—technologies either fail them, present obstacles through the complexity of privacy management options, or (at the very least) cause extra and unwanted work.

We have encountered many examples of how technological infrastructures, tools, and apps let people down in these contexts unless users have a high degree of digital privacy literacy.

When digital privacy is working as a user wishes, the use of online channels can have positive effects during times of change. Users with greater digital privacy literacy may curate their content and audiences, sharing information thoughtfully and leveraging technology to alleviate the emotional labor of change. For example, an interviewee who was living with cancer recalled:

I planned carefully what and how I would share, and it has saved me having multiple and identical conversations, which gets very tiring. Before sharing, I had been ruthless in clearing out my "friends" list to include only those I see as genuine friends and people who I wanted to know what was happening to me.

They may also mobilize support and find acceptance of their "new normal," regardless of what their change is, as an interviewee leaving the military described:

Internet is full of trolls and bullies, but if you can distance yourself from that and find groups that are private and secure, then these groups are there to help....

Failure to proactively manage one's online privacy requirements during a time of change, however, can have persistent repercussions. An interviewee who was going through a breakup related how this can happen:

I was foolish really by thinking what I write was floating out into the ether, but somehow disappearing amongst all the other messages, never to be seen again. I guess I thought that people could see it for a few hours, and then it was lost. Sadly, they are a little more fixed in concrete than that.

That same interviewee described how news of change can easily leak out to the wrong people, engendering unwelcome online visibility and the associated negative ramifications:

I thought I had shared this information anonymously [online], but I got found out by an amateur Sherlock Holmes, and ever since it has screwed my personal life up!

Such visibility is amplified when personal data is aggregated across platforms and across time. For example, for someone living with cancer and looking for a new job, what effect might it have if a potential employer scrutinizes both their LinkedIn professional profile and their publicly visible cancer fundraising biography? Are they at risk of not being hired? What is the impact on personal safety for the owner of a small business who has escaped an abusive relationship and wants to hide where they now live when the U.K. government's Companies House reveals their home address as a matter of public record and lax Strava settings display their daily exercise route? Will their abuser be able to find them?

Relationship Breakdown

Now, let's dig deeper into one common time of change: relationship breakdown. Online, there may be a need to uncouple intertwined identities and adjust the networks and individuals we are connected to in order to reflect our new normal and who knows about it.

If the relationship has been abusive, it is vital to take steps to uncouple these intertwined identities and avoid the unwelcome sharing of personal data. Technological infrastructures, tools, and apps afford additional opportunities to transact abuse and to pursue the survivor after the relationship ends [7]. While domestic abuse takes many forms—physical, sexual, psychological, and economic—up to 72 percent of reported cases now also involve technology-facilitated abuse (TFA) by perpetrators who aim to intimidate, threaten, monitor, impersonate, harass, or otherwise harm survivors [8]. It is transacted across a wide range of devices, including smartphones, laptops, smart home devices, and GPS, and via channels such as social media and text messages, plus stalkerware, spyware, and monitoring tools.

Abusers can make it even harder for survivors to leave, seek help, or resist further harm via imposed restrictions on the survivor's digital privacy and the exploitation of digital technologies and survivors' associated personal data. Survivors may avoid reaching out online for support for fear of discovery. Personal data such as intimate images shared consensually within a relationship may be shared with a wider audience by the abuser, with the intention of causing shame or embarrassment. Location-sharing apps that once seemed like a cute way of staying in touch morph into mechanisms for intrusion and control, as one of our interviewees describes here:

By this stage, I had blocked him on Google because I was sick of getting his constant messages…but I didn't know that I hadn't also blocked the location services…. He…tracked me, knew I was [at an event], knew the exact time, how long I'd been there…. It was really creepy. It was really terrifying as well.

Knowledge of personal data may even facilitate economic abuse. One interviewee explained how knowledge of her email address and detailed personal identifying information enabled her abuser to carry out fraud:

We owned a rental property together. He took out a loan in my name online to pay for the insurance on it, so he didn't have to pay his share. I only found out by accident when I read what I thought was a spam email saying how much I owed! The police said it wasn't fraud as he was my husband—even though we were separated.

Next Steps?

Having shared some high-level framing of the mosaic of personal data and some personal examples from our research, I turn to the question of what can be done. What can we, as HCI, UX, and UxD professionals, do? Whether it is to defend against an abuser intent on TFA or to prevent a cancer diagnosis from being leaked to undesired audiences online, digital privacy settings need to be easier to manage. Some steps are being taken to improve digital privacy literacy. Currently, charities such as Refuge (http://refugetechsafety.org/) have excellent checklists that people can use to self-audit commonly used apps and services and make them more secure.

But this isn't enough. Expecting someone who is going through a period of significant change to spend time fiddling with complex privacy settings across multiple platforms is both unrealistic and unreasonable. It's also unfair. Real life should be allowed to take priority. It shouldn't be so hard to change who we are connected to or what information they see about us, or to avoid the potential adverse consequences of failing to attend to privacy settings.

There is an opportunity here for HCI, UX, and UxD professionals to develop usable privacy-enhancing tools that nimbly assemble an individual's mosaic of personal data across platforms and across time and reflect it back to the individual so that they can understand what they reveal of themselves, and to whom. And we need these tools embedded into apps and online services. This will afford people opportunities to manage their digital privacy during times of change and to shore up unforeseen privacy vulnerabilities at a time when, frankly, people have better things to do.
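To ground this proposal, here is a minimal sketch of what such a tool's core logic might look like: gather what a person has exposed across platforms and reflect it back, grouped by who can see it. The platforms, items, and per-item audience labels are illustrative assumptions, not an existing API.

```python
# Minimal sketch of a privacy-reflection tool: assemble a person's
# exposures across platforms and report them grouped by audience, so
# the user sees at a glance what each group can learn about them.
# Platforms, items, and audience labels are invented for illustration.

from collections import defaultdict

exposures = [
    {"platform": "LinkedIn", "item": "employer and job history", "audience": "public"},
    {"platform": "Fundraising page", "item": "cancer diagnosis", "audience": "public"},
    {"platform": "Strava", "item": "daily running route", "audience": "followers"},
    {"platform": "Facebook", "item": "tagged photos with ex-partner", "audience": "friends"},
]

def reflect_mosaic(items):
    """Group exposed items by audience for review by the user."""
    view = defaultdict(list)
    for it in items:
        view[it["audience"]].append(f'{it["item"]} ({it["platform"]})')
    return dict(view)

for audience, items in reflect_mosaic(exposures).items():
    print(f"Visible to {audience}:")
    for item in items:
        print(f"  - {item}")
```

Even this toy grouping makes the cross-platform risk in the earlier examples visible: the "public" bucket pairs a job history with a cancer diagnosis. A real tool would need to fetch and classify these exposures automatically, which is precisely the usability work proposed here.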

Acknowledgments

The research reported on in this article was undertaken in collaboration with Jo Briggs, Manchester School of Art; Lorna Gibson, University of Dundee; Aikaterini Grimani, Warwick Business School; Ryan Gibson, Diane Morrow, and Emma Nicol, University of Strathclyde; and Daniel Herron, Meta.

The research was funded under the following grants: Keeping Secrets Online, Centre for Research and Evidence on Security Threats; Cumulative Revelations of Personal Data (Engineering and Physical Sciences Research Council, EP/R033889/2); and AP4L: Adaptive PETs to Protect & emPower People during Life Transitions (Engineering and Physical Sciences Research Council, EP/W032473/1).

References

1. Parkinson, B., Millard, D.E., O'Hara, K., and Giordano, R. The digitally extended self: A lexicological analysis of personal data. Journal of Information Science 44, 4 (2018), 552–565; https://doi.org/10.1177/0165551517706233

2. Mitchell, B. Met Police 'passed victims' data to Facebook via online tracking tool.' The Standard. Jul. 16, 2023; https://www.standard.co.uk/news/london/met-police-victim-data-facebook-the-observer-suffolk-norfolk-b1094640.html

3. Armstrong, A., Briggs, J., Moncur, W., Carey, D.P., Nicol, E., and Schafer, B. Everyday digital traces. Big Data & Society 10, 2 (2023); https://doi.org/10.1177/20539517231213827

4. Kendall-Taylor, A., Frantz, E., and Wright, J. The digital dictators: How technology strengthens autocracy. Foreign Affairs 99, 2 (2020), 103–115.

5. Pozen, D.E. The mosaic theory, national security, and the Freedom of Information Act. The Yale Law Journal 115, 3 (2005), 628–679; http://www.jstor.org/stable/25047621

6. Moncur, W. Digital ownership across lifespans. In Aging and the Digital Life Course, Vol. 3. C. Garattini and D. Prendergast, eds. Berghahn Books, 2015, 257–273; https://strathprints.strath.ac.uk/85168/1/Moncur_ADLC_2015_Digital_ownership_across_lifespans.pdf

7. Grimani, A., Gavine, A., and Moncur, W. An evidence synthesis of covert online strategies regarding intimate partner violence. Trauma, Violence, & Abuse 23, 2 (2022), 581–593; https://doi.org/10.1177/1524838020957985

8. Christie, L. and Wright, S. Technology and domestic abuse. UK Parliament Post. Nov. 13, 2020; https://post.parliament.uk/technology-and-domestic-abuse/

Author

Wendy Moncur is a full professor and cybersecurity group lead at the University of Strathclyde in Glasgow. Her interdisciplinary research focuses on online identity, privacy, trust, and cybersecurity, often at times of change. She has over 80 peer-reviewed publications across HCI, law, psychology, design, and AI. [email protected]


Copyright held by author. Publication rights licensed to ACM.

