Session Submission Summary

Preconference: Digital Inequalities and Discrimination in the Big Data Era

Thu, May 25, 8:00 to 16:00, Hilton San Diego Bayfront, Floor: 4, Sapphire 400A

Session Submission Type: Panel

Abstract

A growing number of ordinary objects are being redesigned to include digital sensors, computing power, and communication capabilities, and new objects and processes are becoming part of the Internet. This emerging Internet of Things (IoT) ecosystem (networks of physical objects embedded with the ability to sense, and sometimes act upon, their environment, together with the related communication, applications, and data analysis) enables data to be collected from billions of everyday objects. The datasphere made possible by these developments offers immense potential to serve the public good by fostering government transparency, energy conservation, participatory governance, and substantial advances in medical research and care. At the same time, a growing body of research addresses emerging privacy and civil liberties concerns related to big data, including unjust discrimination and unequal access both to data and to the tools needed to make use of it.
For example, big data analytics may reveal patterns that were previously undetectable. Data about seemingly trivial daily tasks are increasingly federated and used to reveal associations or behaviors, and these analyses, and the decisions based on them, pose potential harms to individuals and groups. Many transactions that once seemed innocuous can now be used to discriminate: one’s movements throughout the day, items purchased at the store, television programs watched, “friends” added or viewed on social networks, and the people one communicated with or was in close proximity to at various times can all feed judgements that affect an individual and his or her life chances. With the advent of artificial intelligence and machine learning, we are increasingly moving toward a world in which many of the decisions around us are shaped by such calculations rather than by traditional human judgement. Sensitive personal information or behaviors (e.g., political or health-related) may, for instance, be used to discriminate when individuals seek housing, immigration eligibility, medical care, education, bank loans or other financial services, insurance, or employment. At the same time, individuals, groups, or regions may be disadvantaged by a lack of access to data, or to the related skills and tools, needed to make use of big data in ways that benefit their lives and communities.
This preconference session seeks to advance understanding of digital inequalities and discrimination related to big data and big data analytics. Full papers of 5,000 to 8,000 words and position papers of 1,000 to 2,000 words are welcome.

Sub Unit

Chairs

Individual Presentations