COVID-19, Technology, and Social Inequality

Photo by CDC on Pexels.com

The COVID-19 pandemic is an unprecedented global public health challenge. It is all the more troubling that its impact, and our response to it, throw into sharp relief social inequalities in areas like healthcare, education, and employment that existed long before the pandemic. Technology, as it is used in each of these sectors, has historically both helped create these inequalities and worked to mitigate them. Now, as we reach for technology to respond to the COVID-19 crisis, we need to do so in ways that don't exacerbate these inequities and that preserve fundamental rights like privacy, fairness, and human dignity. This challenge is daunting.

People of color are disproportionately impacted by COVID-19: they are more likely to be frontline essential workers, to have pre-existing health conditions, and to be uninsured or underinsured. In a digital world, technology mediates many of our healthcare experiences, and this has only been brought to the fore during the pandemic, as we rely increasingly on technology to connect and to access services. For example, are virtual doctor visit technologies reaching all communities equally and giving those who are sick the opportunity to seek care early? Are healthcare institutions collecting and analyzing "big data" from COVID-19 cases across all racial groups to understand how the virus affects everyone, given how medical research has historically excluded women and people of color? And will the machine learning technologies that will undoubtedly be deployed to analyze this data be fair and unbiased toward people of color, as we know they often are not?

Inequalities in education are coming into stark relief now that schools are closed and classroom teaching is moving online. Underfunded schools serving low-income students are struggling to quickly deploy safe and reliable video conferencing and other educational technologies, while better-funded public and private schools have a leg up thanks to their existing, robust technology infrastructures. Even when schools do manage to deploy virtual technologies, they frequently struggle to ensure that all students are included, particularly those who lack reliable high-speed internet, unlimited data plans, modern personal devices, or the online subscriptions to educational resources and productivity software that make learning from home easier. Faced with the disproportionate negative impact of limited technology access on students of color, and in the context of Title VI obligations, some districts early in the outbreak recommended that schools remain closed and make up the instruction time later. Now that it's clear that long school closures are inevitable, will technology empower or hamper schools in providing quality education in a non-discriminatory and equitable way?

Fewer than one third of Americans are privileged enough to be able to work from home during widespread shutdowns. Yet even these workers are being required to use technologies (that they typically don't get to choose) for all of their remote communication, replacing private hallway, lunchtime, and water-cooler conversations with digital ones that may be tracked, recorded, or monitored. Moreover, some companies are using the pandemic as an opportunity to institute or extend employee monitoring practices in a misguided effort to boost work-from-home productivity. In the meantime, companies are planning for a return to the office, and those who were never able to work from home are already experiencing new privacy-invading workplace practices, ranging from temperature checks to symptom or diagnosis reporting. While some of these procedures may prevent infection and protect employees' health and lives, what guarantees do employees have that these technologies are accurate, and that their personal health data won't be used against them or appropriated for other employment-related purposes once the pandemic is over?

All of us are impacted by COVID-19 as technologists reach, unsurprisingly, for technology solutions to the pandemic. Some include COVID-19 screening programs, while others aim to increase blood donations from recovered patients so they can be used to develop therapies. The impact of technology has come to the fore in conversations around digital contact tracing, which could potentially improve upon traditional human contact tracing, the primary method of stemming infectious disease outbreaks. Contact tracing apps, and the Google/Apple API enabling them, can record who you've interacted with and notify you if any of those people report a positive COVID-19 diagnosis, so that you can shelter in place or seek medical care. While potentially more accurate than human memory and faster than conducting dozens of interviews, Bluetooth-based proximity tracing apps have inaccuracies of their own that keep them from being fully reliable. They also pose serious privacy concerns, especially if they are made mandatory, track location data, or share information automatically with health authorities or with the people you've been in contact with. Furthermore, they may not paint a representative picture of the actual spread of the infection, since they can miss less privileged communities that lack adequate connectivity for these apps to work. How should we weigh potentially life-saving tech solutions against other values we hold dear, like privacy, fairness, and justice?
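Much of the privacy debate turns on what information ever leaves your phone. As a rough illustration only, and not the actual Google/Apple Exposure Notification API (whose cryptographic design is considerably more involved), here is a minimal Python sketch of the decentralized idea behind these apps: phones exchange short-lived random identifiers over Bluetooth, and matching against identifiers published by diagnosed users happens locally on each device. The `Phone` class and its methods are hypothetical names used purely for this sketch.

```python
# Simplified sketch of decentralized proximity notification (illustrative only,
# NOT the Google/Apple Exposure Notification API): each phone broadcasts
# rotating random identifiers over Bluetooth, remembers the identifiers it
# hears, and later checks them locally against identifiers published by users
# who report a positive diagnosis. No names, locations, or contact graphs
# leave the phone.

import secrets


class Phone:
    def __init__(self):
        self.my_ids = []        # rotating random identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers overheard from nearby phones

    def broadcast_id(self):
        """Generate and 'broadcast' a fresh random identifier (rotated frequently)."""
        rolling_id = secrets.token_hex(16)
        self.my_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id):
        """Record an identifier received from a nearby phone over Bluetooth."""
        self.heard_ids.add(rolling_id)

    def report_positive(self):
        """On a positive diagnosis, publish only one's own broadcast identifiers."""
        return list(self.my_ids)

    def check_exposure(self, published_ids):
        """Match published identifiers against locally stored ones; nothing is uploaded."""
        return bool(self.heard_ids & set(published_ids))


# Usage: Alice and Bob are near each other; Bob later tests positive.
alice, bob = Phone(), Phone()
alice.hear(bob.broadcast_id())   # their phones exchange rolling identifiers
bob.hear(alice.broadcast_id())

diagnosis_list = bob.report_positive()       # Bob's identifiers are published
print(alice.check_exposure(diagnosis_list))  # True: Alice is notified locally
```

Even in this idealized form, the trade-offs discussed above remain: the scheme only works for people whose devices and connectivity let them participate, and any deviation from it, such as mandatory use or centralized location tracking, changes the privacy picture entirely.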

While some, including unfortunately the US President, may be surprised by these trends, many researchers, sociologists in particular, have rigorously studied the persistent reproduction of structural inequality in America (and globally), and although disheartened by it, they are not especially surprised. Technology is just another social practice, one that cannot be divorced from the social institutions in which it is embedded. So it should not be surprising that technology can be both part of the problem and part of the solution to COVID-19. We can choose to make it the latter, but only if we do so carefully, with human dignity and justice in mind.
