What is the compelling question or challenge?
For technology to promote a healthy society, it must benefit people equitably. How do we design for, evaluate, and monitor the distribution of effects, not just overall utility, in technical systems?
What do we know now about this Big Idea and what are the key research questions we need to address?
The next generation of large-scale computing systems must ensure an equitable distribution of systems' benefits and impacts. Doing so requires equity and beneficence to be considered throughout the design, development, deployment, and monitoring of systems that affect the daily lives of members of society, and it calls for a wide range of disciplines, including computer science and the humanities. It starts with understanding our systems: Do outcomes from different systems serve all individuals equally, regardless of background or context? Do systems provide privacy and security to all users equitably? It also requires going deep into the needs and expectations that different populations bring to systems, to answer questions such as: Do systems meet the needs of all their users, or do the needs of a less visible population go unrecognized? How do we incorporate equity and beneficence into curricula to train current and future workers and the next generation of researchers? What methods and tools are needed to investigate these questions?
Documenting system utility and impact is only the beginning. We must systematize that knowledge to predict the likely impacts of new technologies, enabling better ethical reasoning about the technologies we build. We need (i) modeling strategies and impact models that foster development of technologies that are resistant to negative effects or disparate impact, (ii) quality assurance techniques to detect such properties, (iii) the ability to monitor deployed systems to alert on undesirable social effects or disparate utility degradation, and (iv) the ability to identify when groups we do not yet know to look for are being mistreated or ill-served. Doing so should be guided by the following directives:
Stakeholder participation. We need methods of engaging more stakeholders, and more diverse stakeholders, across the product lifecycle. Participatory design provides a framework for involving users in the design process, yet open challenges remain in scaling the approach so that relevant stakeholders can be involved at every stage of the lifecycle and in the assessment of complex systems.
Privacy and security. Systematically weaker privacy or security protections for vulnerable subsets of society will only exacerbate inequality. Research on privacy management has emphasized access control languages, application privacy settings, and formal privacy policies, while existing tools and algorithms focus on enterprise and organizational privacy management. We need to center the main actor, the user, and provide a privacy framework with customizable policies: the definition of privacy varies from person to person based on personality and cultural background, and users' privacy preferences are dynamic, changing with the information-sharing context and environment.
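As a hypothetical illustration of such user-centered, context-sensitive policies, the sketch below evaluates access requests against per-user rules keyed by data type and sharing context. All names here (`Request`, `UserPolicy`, the rule keys) are invented for this example, not a proposed standard.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    requester: str   # who is asking, e.g. "family", "advertiser"
    data_type: str   # what is requested, e.g. "location", "health"
    context: str     # sharing context, e.g. "emergency", "advertising"

class UserPolicy:
    """Per-user, context-sensitive privacy policy (illustrative sketch).

    `rules` maps (data_type, context) pairs to the set of requester
    roles the user allows; unmatched requests fall back to default-deny.
    """
    def __init__(self, rules, default_deny=True):
        self.rules = rules
        self.default_deny = default_deny

    def allows(self, req: Request) -> bool:
        allowed = self.rules.get((req.data_type, req.context))
        if allowed is None:
            # No rule for this situation: apply the user's default.
            return not self.default_deny
        return req.requester in allowed
```

Because each user supplies their own `rules` table, two users can hold opposite policies for the same request, capturing the person-to-person and context-to-context variation described above.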
Real-time monitoring. It is not enough to design and deploy systems with beneficence and equity in mind; we must monitor those systems to ensure they continue to provide equitable benefit as they are used and adapt to their environment and user input. Aggregating heterogeneous technologies and evaluating their effects on various groups requires analysis of distributed data in near real-time settings, with long-term archives allowing time-varying trends to be discovered and evaluated.
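A minimal sketch of such monitoring, under the assumption that each interaction yields a utility score tagged with the user's group, might keep a sliding window per group and alert when any group falls well behind the best-served one. `EquityMonitor`, its window size, and its gap threshold are illustrative choices, not proposed values.

```python
from collections import defaultdict, deque

class EquityMonitor:
    """Sliding-window monitor of a per-group utility metric (sketch).

    Alerts on any group whose windowed mean utility falls more than
    `gap` below the best-served group's windowed mean.
    """
    def __init__(self, window=100, gap=0.2):
        self.gap = gap
        self.scores = defaultdict(lambda: deque(maxlen=window))

    def record(self, group, utility):
        """Log one interaction's utility score for a group."""
        self.scores[group].append(utility)

    def alerts(self):
        """Return the groups currently falling behind."""
        means = {g: sum(s) / len(s) for g, s in self.scores.items() if s}
        if not means:
            return []
        best = max(means.values())
        return [g for g, m in means.items() if best - m > self.gap]
```

The long-term archives mentioned above would complement this near-real-time view: the same per-group windows, persisted over time, expose slow drifts that no single window can reveal.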
Formal methods. To guarantee fair access to and deployment of systems, we need to employ formal methods to express fairness and information-adequacy requirements in a formal language (which involves developing a domain-specific language that is expressive enough to capture fairness and adequacy, yet remains verifiable), to model complex systems, and to verify that the systems satisfy those requirements.
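In miniature, one verifiable fairness requirement reads: changing only the protected attribute never changes the decision. The sketch below checks that property by exhaustive enumeration over a toy input space; a real system would discharge it with a verifier or SMT solver rather than brute force, and the `decision` rule is an invented example.

```python
from itertools import product

def decision(income, debt, group):
    """Toy loan-approval rule; deliberately ignores `group`."""
    return income - debt >= 2

def verify_group_independence(rule, incomes, debts, groups):
    """Verify: for every input, varying only the protected attribute
    leaves the decision unchanged (checked by exhaustive enumeration
    over the given finite input space)."""
    for income, debt in product(incomes, debts):
        outcomes = {rule(income, debt, g) for g in groups}
        if len(outcomes) > 1:
            return False  # counterexample: group alone flipped the decision
    return True
```

The value of stating the requirement formally is exactly this checkability: the same property, written in a suitable specification language, could be verified against models far too large to enumerate.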
Effective accountability. The computing systems we are now seeing deployed may require new notions of accountability: how and to whom should they be held accountable, and who should be held accountable for their operation and effects?
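One concrete starting point for the quality-assurance techniques called for above is a disparate impact check. The sketch below, a hypothetical illustration, computes per-group selection rates from (group, selected) records and their worst-case ratio; the 0.8 threshold mentioned in the comment echoes the "four-fifths rule" used in US employment law, cited here only as context.

```python
def selection_rates(outcomes):
    """Per-group selection rates from (group, selected) pairs."""
    totals, selected = {}, {}
    for group, was_selected in outcomes:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(outcomes):
    """Ratio of the lowest to the highest group selection rate.
    Ratios below 0.8 are often flagged (the "four-fifths rule")."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())
```

A check this simple only covers known groups and a single metric; it is the kind of building block that directive (iv) above asks us to go beyond.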
Why does it matter? What scientific discoveries, innovations, and desired societal outcomes might result from investment in this area?
Without careful attention to the distribution of costs, benefits, and other effects, technological systems are likely to exacerbate inequality and reify existing social strata, rather than promote a healthy and flourishing society. For all people to benefit from technology, we cannot blindly rely on trickle-down effects; we need to consider who might be underserved by our current technologies and design to meet their needs. We need to consider who experiences negative effects from technology, and investigate how to mitigate negative impact and ensure it is not too concentrated. We need to consider who is able to effectively benefit from technical systems, and whether access to those benefits is fairly distributed.
The challenge of equity and beneficence will drive and require significant advances in formal methods, experimental protocols and statistical methods, monitoring and observation capabilities, and many other technical fields; it demands not only development aimed at these challenges, but development done in a modular fashion that enables reuse across social concerns. Fairness alone has numerous definitions, some of them mutually incompatible; to make efficient progress on fairness and to address other social and human difficulties, methods and tools must be adaptable and reusable across multiple metrics and problem framings.
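The incompatibility among fairness definitions is easy to exhibit: a classifier can equalize true positive rates across groups (equal opportunity) while selecting from those groups at very different rates (violating demographic parity). The sketch below computes both metrics from (group, label, prediction) records; the record layout and the toy data in the usage are assumptions made for illustration.

```python
def selection_rate(records, group):
    """Fraction of a group receiving a positive prediction
    (the quantity demographic parity asks to equalize)."""
    preds = [pred for g, label, pred in records if g == group]
    return sum(preds) / len(preds)

def true_positive_rate(records, group):
    """Fraction of a group's true positives that are predicted positive
    (the quantity equal opportunity asks to equalize)."""
    preds = [pred for g, label, pred in records if g == group and label == 1]
    return sum(preds) / len(preds)
```

When the groups have different base rates of the true label, no classifier short of a trivial one can satisfy both criteria at once, which is why tools must let practitioners swap the metric that fits their problem framing.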
Technology has the power to be a democratizing force and a social equalizer, but there is no reason to believe that its benefits will be well distributed by default. Only a concerted effort in research, education, and application, across disciplines and across technology lifecycle phases, can ensure that technical systems achieve their potential to promote well-being for all of society. Developing the science of human impact at scale, to the point where we can systematically understand, predict, and when necessary adjust disparate user experiences, will require experts in human-computer interaction, privacy, algorithmic fairness, information retrieval, real-time performance, formal methods, sociology, political science, and psychology, along with domain experts from specific fields of application. Science and technology studies provide a roadmap for this effort, but significant work remains to integrate equity and beneficence across disciplines and throughout the product lifecycle.
If we invest in this area, what would success look like?
Our primary objective is to minimize the negative impacts of data and technology on marginalized groups in society. This requires that such situations be identified or predicted, understood, and remedied, and that everyone involved in the product lifecycle and its evaluation be trained in the necessary methods and modes of thought. This, in turn, will require curriculum design across disciplines and across the educational spectrum, from primary school to professional continuing education.
The first step towards success will be developing definitions of and means of assessing equity and beneficence in sociotechnical systems, and their ability to provide benefit to all users regardless of their social status, membership in protected groups, or other characteristics that should not determine quality of service. For example, does a smart grid system provide equitable service to all its users, or does it achieve its aims at the disproportionate expense of lower-income neighborhoods?
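For the smart-grid example, one simple assessment might compare average outage minutes across neighborhoods grouped by income band. The sketch below is a hypothetical audit helper; the function name, input shapes, and data in the usage are all invented for illustration.

```python
def service_disparity(outage_minutes, income_band):
    """Mean outage minutes per income band (hypothetical grid audit).

    `outage_minutes` maps neighborhood -> total outage minutes observed;
    `income_band` maps neighborhood -> band label, e.g. "low" / "high".
    A large gap between bands is a signal of inequitable service.
    """
    totals, counts = {}, {}
    for hood, minutes in outage_minutes.items():
        band = income_band[hood]
        totals[band] = totals.get(band, 0) + minutes
        counts[band] = counts.get(band, 0) + 1
    return {band: totals[band] / counts[band] for band in totals}
```

An audit like this only surfaces a disparity; explaining it, and deciding whether it reflects inequitable design rather than, say, older infrastructure, is exactly the kind of question that requires the cross-disciplinary work described above.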
Long-term success will ultimately manifest as reduced disparity of advantages between various groups.
Why is this the right time to invest in this area?
The National Science Foundation (NSF) is investing heavily in harnessing the data revolution, the technology frontier, the growing deployment of formal methods in the field, and convergence research. These efforts are the pillars that will drive development of significant technologies that benefit society. As these technologies become ever more involved in our daily lives, the capacity to harness, connect, and understand volumes of data that grow by the minute, the ability to bring multiple stakeholders to the table and to model system requirements and user interactions, and the ability to think holistically about solutions drawn from multiple complementary yet heterogeneous disciplines will all be essential to producing solutions that respond to, and are seamlessly adopted by, our society's evolving needs. Our main concern moving forward is how existing research and development efforts will be integrated, sustained, and distributed among the members of society.