XXVII.6 November - December 2020

Technologies that disempower

Aaditeshwar Seth


Paper-based bureaucratic procedures to access government services such as welfare measures can be user-unfriendly and riddled with red tape [1]. In India, despite extensive digitization of such processes over the past decade, government welfare services remain hard for low-income populations to access, causing significant distress during the Covid-19 pandemic. These problems are systemic in nature, rooted in the design of e-governance technologies. Here, we focus on several design flaws that manifested themselves during the pandemic.



To curtail the rapid transmission of the virus, the Indian government imposed a stringent lockdown of approximately 75 days, during which millions of people lost their jobs and valuable sources of income. The government announced several emergency relief measures with both cash and in-kind transfers, yet many people remained excluded because of technology-related issues. In this article, we describe these issues and also briefly discuss why design errors continue to emerge with technologies deployed by the state.

We were able to learn about these reasons for exclusion through an interactive-voice-response (IVR)-based community media platform operated by the social enterprise Gram Vaani, on which people recorded their grievances and experiences in their own voices [2].

Technology Can Be Widely Adopted Without a Clear Use Case

Two primary relief measures announced by the government were 1) an increase in the quota of subsidized food for the poor through the Public Distribution System (PDS) and 2) cash transfers to the bank accounts of people registered under government schemes on agriculture, banking, employment, and other areas. Both schemes use Aadhaar, a unique-ID-based authentication system that relies on fingerprint biometrics to authenticate people at the points of service: the ration-shop dealerships that distribute subsidized food, and the rural bank branches or banking correspondents that enable people to access their bank accounts from remote areas.

During the lockdown, biometric fingerprint failure unfortunately emerged as the most significant source of error in the Aadhaar-based authentication process for both the PDS and the cash withdrawal system. People facing such errors were unable to access their food rations [audio report] and went hungry. The reasons for biometric failure are well documented, ranging from machine failures to illegible fingerprints to possible changes in fingerprints as people age. This raises the question of why Aadhaar was built around biometrics in the first place.

Audio-enabled POS machines could help level the power imbalance between the banking agents and consumers.

Biometrics were originally deemed essential in Aadhaar for de-duplication, so that nobody could have more than one Aadhaar card. However, no studies have been published by the government on the efficacy of fingerprint biometrics for de-duplication, and one paper in fact claims a high false rejection rate if biometric de-duplication is implemented [3]. Clearly, if biometrics do not help with the one use case that motivated their adoption, then there is no need for them. Even simple smartcard-based solutions could have been used for point-of-service (POS) authentication. Several states did suspend biometric-based authentication for PDS during the lockdown, further corroborating the technology's limited value.


Technology Design Can Disempower Citizens and Reproduce Inequalities

Who understands and wields a technology is a key driver in determining whom it empowers and disempowers. With the cash transfers announced by the government, banking correspondents using Aadhaar-enabled POS machines were critical in delivering banking services to people at their doorstep. However, widespread cases of fraud arose, aided by the design peculiarities of the POS machines. The machines are administered by agents, and there are no displays or audio notifications to communicate to the beneficiaries which specific transactions have actually been performed after the beneficiaries have authenticated themselves. It was therefore easy for banking correspondents to underreport to customers [audio report] the amount of cash available for withdrawal, falsely claiming that an inactive-account fee had been applied [audio report], and then pocket the difference. Some went even further, telling customers that their bank account had been deactivated [audio report], while all the time the accounts were functional and withdrawals were successfully conducted. A beneficiary-centric design of the POS machines could have empowered beneficiaries. For example, audio-enabled POS machines that speak out each and every transaction, explain error codes in simple terms, and suggest appropriate actions in the case of failure could help level the power imbalance between the banking agents and consumers.
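The audio-enabled POS idea can be sketched in a few lines. This is a minimal illustration, assuming hypothetical error codes, message texts, and record fields; it is not the actual AePS interface:

```python
# Hypothetical sketch: compose a plain-language announcement for every
# POS transaction, so the beneficiary hears exactly what was done.
# All field names and error codes here are illustrative assumptions.

ERROR_EXPLANATIONS = {
    "E100": ("Fingerprint did not match.",
             "Please clean your finger and try again, or ask to use an OTP instead."),
    "E200": ("The bank server did not respond.",
             "No money has been withdrawn. Please try again later."),
}

def announce(transaction):
    """Return the text a text-to-speech unit would speak aloud."""
    if transaction["status"] == "success":
        return (f"Withdrawal of {transaction['amount']} rupees successful. "
                f"Remaining balance: {transaction['balance']} rupees.")
    explanation, action = ERROR_EXPLANATIONS.get(
        transaction["error_code"],
        ("An unknown error occurred.", "Please contact the bank."))
    return f"Transaction failed. {explanation} {action}"

print(announce({"status": "success", "amount": 500, "balance": 1200}))
print(announce({"status": "failed", "error_code": "E100"}))
```

The point of the sketch is that every outcome, including every failure, produces a message the beneficiary can hear and act on, rather than a code only the agent can read.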

The same principle applies to ration shops, where the dealer clearly has opportunities to misquote problems to the beneficiaries. Dealers have been known to falsely claim a shortfall in stock [audio report] and not give people their full units of food. The beneficiaries have no way to check, and must go with whatever the dealer says [audio report]. By placing technology in the hands of the dealer, technology that the beneficiaries do not understand, the inherent power differential between them sharpens and can spill over into their other dealings. Instead, technology should be designed for the weakest stakeholder in the ecosystem [4]. If the ration dealer wants to claim a shortfall, then the technology could mandate that they record an oral testimony right there at the point of service to confirm the shortfall and also raise a complaint upstream in the PDS system, which could then be audited by the food-supply department.

Further, entirely different technologies can be envisioned that would empower the disempowered and plug PDS quantity fraud by ration dealers. Technologies that inform people, through SMS, IVR, or other means, about stock deliveries at their ration shops, the correct number of units of food due to them, and the grievance-redressal mechanisms for raising and resolving complaints can go a long way toward empowering people. Even nontechnological solutions in some states, where ration shops are operated by the community with support from women's self-help group (SHG) networks, have made strides in remedying this problem.

Technology May Be Poorly Designed

Organized-sector workers in India whose income is below a certain threshold are supposed to be registered under the Provident Fund (PF) scheme, a financial social security scheme to which both the employer and the worker contribute monthly installments. During the lockdown, the government allowed people to make additional withdrawals from their PF accounts, but countless workers were unable to withdraw because of mismatches in their details. These included the spelling of names [audio report] in their PF account that did not match the name in their bank account or in the Aadhaar system, and similar mismatches in their date of birth, dates of employment, mobile phone number, and so on. The workers were unable to make these corrections because the PF IT system allows only employers or ex-employers to make such changes. However, employers are hardly responsive to such issues because they do not have strong incentives to correct errors on behalf of their workers. It also does not help that many workers do not even know that their PF was being deducted [audio report], or what their PF account number is [audio report], or the procedures to withdraw funds [audio report]. It is not hard to conceive of a setup in which workers are informed of these deductions through SMS messages, the PF account number is printed on their pay slips, and a user-facing system lets workers fix such errors on their own, rather than relying on employers.

Even Well-Designed Technology Can Miss Opportunities for Citizen Empowerment

To provide quick employment to those out of work, the Mahatma Gandhi National Rural Employment Guarantee Act (MNREGA) scheme emerged as a strong fallback mechanism for unskilled laborers. The MNREGA management information system (MIS) is highly impressive, housing each worker's details, their job card, past projects, attendance, and payment status, all queryable in almost real time over the Web. However, the people doing MNREGA work are unable to use online systems to check their accounts without help from others. The MIS thereby serves largely as an accounting tool to keep track of various work assignments and payments, or initiate new ones, and has ultimately remained a system for the administration rather than for the workers. The workers continue to face issues with registering their requests for work [audio report], tracking payments [audio report], and raising grievances [audio report]. This is a significant missed opportunity in making citizen-friendly government systems. IVR- or app-based interfaces through which workers could access their own work history and payments [5] could easily be implemented. Such a rightful enhancement would let workers check that the work they do is being correctly logged and that they are receiving their due wages, and request new work when they need it.
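As a rough sketch of such a worker-facing IVR front end, consider the following. The record fields and phrasing are assumptions for illustration; the point is that the summary is composed from the same records the MIS already holds:

```python
# Hypothetical IVR back end: given a worker's job-card number, compose
# a spoken summary of their recorded MNREGA work and wage status.
# Record fields ("job_card", "days", "wages", "paid") are assumptions.

def work_summary(job_card, records):
    """Return the sentence an IVR prompt would read out for this worker."""
    mine = [r for r in records if r["job_card"] == job_card]
    if not mine:
        return "No work is recorded against this job card."
    days = sum(r["days"] for r in mine)
    paid = sum(r["wages"] for r in mine if r["paid"])
    due = sum(r["wages"] for r in mine if not r["paid"])
    return (f"You have {days} days of work recorded. "
            f"{paid} rupees have been paid, and {due} rupees are pending.")

records = [
    {"job_card": "JC-7", "days": 10, "wages": 2020, "paid": True},
    {"job_card": "JC-7", "days": 5, "wages": 1010, "paid": False},
]
print(work_summary("JC-7", records))
```

A worker who hears this summary over a phone call can immediately tell whether their attendance was logged and whether a payment is stuck, without needing Web access or an intermediary.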

Technology May Fail to Empower People Due to Last-Mile Infrastructure Limitations

While there was nothing faster than direct bank transfers to provide instant cash relief to people who were suddenly left without any income, the limited physical presence of banking infrastructure in rural areas prevented people from easily acquiring the cash. There was massive overcrowding at the few functional bank branches, highlighting the fact that financial digitization cannot happen without also scaling up the physical and human infrastructure of banking services [6]. People had to walk far to their banks and wait for hours in line [audio report], often to return empty-handed when the Internet at the banks was down [audio report]. While such issues have been highlighted in the past, such as in a failed government pilot in Jharkhand and in problems with maintaining people's active bank accounts, cashless payments are being adopted by more and more government schemes, even as the expansion of banking and ATM infrastructure in rural areas is slowing down. Such examples highlight the fallibility and insufficiency of technology in solving access problems, yet we see little effort toward developing appropriate processes to handle cases of errors, misuse, and technology failure.

Technology Can Amplify Discrimination and Distrust Among People

The Indian government released a contact-tracing mobile app called Aarogya Setu, mandating its use among groups such as migrant workers, who returned to their rural homes in large numbers when they lost work in the cities. However, migrant workers faced widespread discrimination [audio report] in their villages due to fear of their transmitting the virus; these dynamics were reflected in the adoption of Aarogya Setu. While the app was welcomed by local residents who saw the surveillance as useful, many others expressed reservations. If they did come out as at risk or infected, they feared they could be forced into a poorly run isolation center [audio report], or that their entire household could be confined and their families would face discrimination in the community.

Technology does not operate in a vacuum; it is used by people situated in particular contexts, and its use is shaped by perceptions of trust.

It is therefore important to remember that technology does not operate in a vacuum; it is used by people situated in particular contexts, and its use is shaped by perceptions of trust. If this environment has trust deficits, then the technology might amplify these deficits, alienating people from the technology [audio report] and from one another. Even worse, it could be used as a tool by malicious actors to deepen the deficits and mistrust. As highlighted by many, concerns about the ambiguous data-privacy policies of contact-tracing apps as well as doubts about their usefulness reveal additional pathways through which trust is disrupted when the sociological context of technology deployment is ignored.

Technology Can Be Marketed to Serve Misguided Objectives

Aadhaar was clearly conceived as a system to reduce inclusion errors in welfare schemes, that is, to deny welfare benefits to those who do not deserve them, and was marketed accordingly as a means to plug such leaks. Instances of unauthorized access to welfare benefits are, however, minuscule compared with the leaks caused by ration dealers through quantity fraud, or the exclusion errors affecting millions of families who were denied ration benefits because they did not have ration cards [audio report], even during this period of great need. Had the focus been on reducing exclusion errors, there would have been little reason to formulate a solution like Aadhaar. In fact, even with Aadhaar, a simple analysis of the authentication logs can reveal beneficiaries for whom biometric re-registration or name correction may reduce avoidable exclusion errors, but the government has not taken such steps. Instead, the narrative that persists is the need to reduce inclusion errors rather than exclusion errors. This is clearly a misguided objective to justify the need for Aadhaar; the more relevant problems to solve are quite different.
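The log analysis suggested here is straightforward. A minimal sketch, assuming hypothetical log fields (the real authentication logs are not public): flag beneficiaries whose authentications fail repeatedly on biometrics and never succeed, as candidates for re-registration.

```python
# Sketch of the suggested exclusion-error analysis. Log entry fields
# ("id", "result", "reason") are assumptions for illustration.
from collections import Counter

def flag_for_reregistration(auth_logs, min_failures=3):
    """Return IDs with >= min_failures biometric failures and no success."""
    failures, successes = Counter(), set()
    for entry in auth_logs:
        if entry["result"] == "success":
            successes.add(entry["id"])
        elif entry["reason"] == "biometric_mismatch":
            failures[entry["id"]] += 1
    return sorted(i for i, n in failures.items()
                  if n >= min_failures and i not in successes)

logs = [
    {"id": "A1", "result": "failure", "reason": "biometric_mismatch"},
    {"id": "A1", "result": "failure", "reason": "biometric_mismatch"},
    {"id": "A1", "result": "failure", "reason": "biometric_mismatch"},
    {"id": "A2", "result": "failure", "reason": "biometric_mismatch"},
    {"id": "A2", "result": "success", "reason": ""},
]
print(flag_for_reregistration(logs))  # ['A1']
```

Even this crude heuristic would surface people who are systematically locked out, which is precisely the exclusion-error problem the prevailing narrative ignores.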

What Drives the Adoption of Technologies That Further Disempower the Vulnerable?

These problems, from poorly thought-through technology design and ignorance of the actual operational context to contrived problem statements and the mishandling of failure cases and public communication at the sociotechnological interface, recur again and again. The latest wave to be unleashed is that of AI-inspired technologies, in areas such as assessing the creditworthiness of low-income people using mobile call-data records, risk scoring for criminal recidivism, and performance assessment of human resources across a large range of industry sectors. Many governments have enthusiastically embraced these applications, but they are fraught with the same risks and may end up disempowering the vulnerable.

With such grave issues in the vast technological systems that enable access to welfare benefits for millions of people, why does the state adopt these disempowering technologies? Why are these technologies not modified to minimize harm? Several theories can explain this. One is that governments hold a strong belief in high modernism: that technology can make citizens legible to the government and control them, which can aid in national security, the targeting of benefits, tax collection, and so on [7]. This leads the state to look for solutions that ensure compliance and catch misconduct, not for solutions that ensure inclusive access and equitable distribution. Another theory centers on capitalism's need for constant technological innovation [8], which seeks out new markets, including marketing technologies to the state. If the customer is the state and not the beneficiary, then capital will build technologies for the state and not the beneficiary; further, it will exercise a strong influence on the uptake of these technologies by projecting them as silver bullets that can solve the state's concerns. In India too, the government's desire for technology that facilitates centralization has been cleverly serviced by capital's ingenuity in providing such technology [9]. In fact, the shared preference of both the state and capital for controlling and predicting population-scale behavior may explain their convergence toward greater centralization of technology. Citizens must reject this flawed government mindset of the wholesale adoption of disempowering technologies. And as technologists building many such technologies, we should also reconsider contributing to their design in the first place [10].

Acknowledgments

I would like to thank Jean Drèze and Subhashis Banerjee for their feedback and useful pointers for the article.

References

1. Graeber, D. Dead Zones of the Imagination: On Violence, Bureaucracy, and Interpretive Labor. The Malinowski Memorial Lecture, 2006.

2. Moitra, A., Das, V., Kumar, A., and Seth, A. Design lessons from creating a mobile-based community media platform in rural India. Proc. of the Eighth International Conference on Information and Communication Technologies and Development. ACM, New York, 2016, Article 14, 1–11.

3. Mathews, H.V. Flaws in the UIDAI process. Economic and Political Weekly 51, 9 (2016).

4. Duquenoy, P. and Thimbleby, H. Justice and design. Proc. of International Conference on Human-Computer Interaction, 1999.

5. Vivek, S., Vardhan, V., Kar, S., Asthana, S., Narayanan, R., Singh, P., Chakraborty, D., Singh, A., and Seth, A. Airavat: An automated system to increase transparency and accountability in social welfare schemes in India. Proc. of the Sixth International Conference on Information and Communications Technologies and Development: Notes - Volume 2. ACM, New York, 2013, 151–154.

6. Vivek, S., Narayanan, R., Chakraborty, D., Veeraraghavan, R., and Vardhan, V. Are technology-enabled cash transfers really 'direct'? Economic and Political Weekly 53, 30 (2018).

7. Scott, J.C. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. Yale Univ. Press, 1999.

8. Harvey, D. The fetish of technology: Causes and consequences. Macalester International 13, 7 (2003).

9. Sen, A. et al. An attempt at using mass media data to analyze the political economy around some key ICTD policies in India. Proc. of the 10th International Conference on Information and Communication Technologies and Development. ACM, New York, 2019, Article 21, 1–11.

10. Seth, A. A call to technologists. 2019; http://www.cse.iitd.ernet.in/~aseth/call-to-technologists-2019.pdf

Author

Aaditeshwar Seth is a faculty member in the Department of Computer Science and Engineering at IIT Delhi in India, and also cofounder of Gram Vaani, a social enterprise that uses voice-based technologies to empower rural and low-income communities to run their own participatory media platforms. aseth@gramvaani.org


©2020 ACM  1072-5520/20/11  $15.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.
