Resources

PERVADE: Pervasive Data Ethics for Computational Research

Katie Shilton (Co-Principal Investigator)
Matthew Bietz (Co-Principal Investigator)

Casey Fiesler (Co-Principal Investigator)
Jacob Metcalf (Co-Principal Investigator)
Arvind Narayanan (Co-Principal Investigator)

Jessica Vitak (Co-Principal Investigator)
Michael Zimmer (Co-Principal Investigator)

Big, pervasive data about people enables fundamentally new computational research, but also raises new ethical challenges, such as accounting for distributed harms at scale, protecting against the risks of unpredictable future uses of data, and ensuring fairness in automated decision-making. National debates have erupted over online experiments, leaked datasets, and the definition of “public” data. Investigators struggle to advise students on engaging vulnerable populations or navigating terms of service. Regulators debate how to translate traditional ethical principles into workable policy guidance. Research addressing these challenges has hit roadblocks caused by a lack of empirical knowledge about emerging norms and expectations. This project discovers how diverse stakeholders – big data researchers, platforms, regulators, and user communities – understand their ethical obligations and choices, and how their decisions impact data system design and use. It also compares stakeholder perspectives against the risks and realities of pervasive data itself, answering fundamental questions about the fairness and ethics of such research. Understanding how computing researchers adapt their practices in the big data era, and highlighting points of convergence or conflict with data realities, user expectations, and regulatory practices, will produce concrete guidance for pervasive data ethics.

Safely Searching Among Sensitive Content

Doug Oard (Principal Investigator)
Jimmy Lin (Co-Principal Investigator)

Large text collections such as email are critical information resources, but they can also contain sensitive information. For this reason, citizens cannot yet search some government records because of the protected information they contain. Scholars are not yet allowed to see much of the growing backlog of unprocessed archival collections. These limitations are consequences of the fact that current search engines can only protect sensitive content if that sensitive content has been marked in advance. As the volume of digital content continues to increase, current approaches based on manually finding and marking all of the sensitive content in a collection cannot affordably accommodate the scale of the challenge. This project creates a new class of search algorithms designed to balance the searcher’s interest in finding relevant content with the content provider’s interest in protecting sensitive content.
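One way to picture this balance is as a penalized ranking function. The sketch below is purely illustrative, not the project’s actual algorithm: the penalty weight `lam` and the per-document sensitivity estimates are hypothetical stand-ins for whatever model the search system would use.

```python
# Illustrative sketch: trade off topical relevance against the estimated
# probability that a document is sensitive. A larger penalty weight `lam`
# pushes likely-sensitive documents further down the ranking.

def penalized_score(relevance: float, p_sensitive: float, lam: float = 2.0) -> float:
    """Score a document: relevance minus a penalty for likely sensitivity."""
    return relevance - lam * p_sensitive

def rank(docs, lam: float = 2.0):
    """Return document ids ordered by penalized score, best first.

    Each element of `docs` is (doc_id, relevance, estimated P(sensitive)).
    """
    scored = ((doc_id, penalized_score(rel, p, lam)) for doc_id, rel, p in docs)
    return [doc_id for doc_id, _ in sorted(scored, key=lambda t: t[1], reverse=True)]

# Hypothetical documents: "a" is highly relevant but likely sensitive,
# so the penalty demotes it below less relevant but safer documents.
docs = [("a", 0.9, 0.8), ("b", 0.8, 0.1), ("c", 0.5, 0.0)]
```

Here `rank(docs)` returns `["b", "c", "a"]`: the most relevant document is demoted because its sensitivity penalty outweighs its relevance advantage.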

Ethical Cultures in Computer Security Research

Megan Finn (Principal Investigator)
Jevin West (Co-Principal Investigator)

Franziska Roesner (Co-Principal Investigator)

Computer security researchers must navigate ethical dilemmas about how to use big data and shared networked resources to discover vulnerabilities; how to safely expose these problems; and how to best ensure that critical vulnerabilities are fixed. This project analyzes the scholarly discourse and private reflections of computer security researchers over time, to reveal insights about how people, changes in technology, and changes in research practices shaped ethical cultures in security research, and how ethics shaped research practice.

Find our recent poster describing curricular work here: WritingSecurity.

Ethical Computing in Mobile & Wearable App Development

Katherine Shilton (Principal Investigator)
Adam Porter (Co-Principal Investigator)
Susan Winter (Co-Principal Investigator)
ICONS Project (http://www.icons.umd.edu/)

In this NSF-funded project, the research team studies academic and commercial software research and development to discover factors that encourage discussion and action on ethical challenges. We will then incorporate findings into curricular materials for computer ethics by building interactive simulations both for the classroom and for massive open online courses (MOOCs). Project outcomes will answer the following questions:

  • What practices within mobile application research and development encourage discussion of, and decisions about, ethics?
  • How can these practices be incorporated into computer ethics education?
  • How do educational simulations based on these practices impact students’ learning and development practices?

National Science Foundation SES-1449351
Google Faculty Research Award


Past Projects

Values in a Future Internet Architecture

Katherine Shilton (Principal Investigator)
Jeff Burke, UCLA (Co-Principal Investigator)

Named Data Networking (NDN) is a long-term research effort to redesign the underlying architecture of the Internet. This project studies NDN’s impacts on social issues such as privacy, intellectual property, law enforcement, governance, and policy. The project investigates the distinction between values intended by developers in the NDN core architecture and values enacted in its implementation(s). Research questions include:

  1. How do values embedded in the NDN architecture become enacted in application design and use?
  2. What social issues are bound up in NDN technical problems?
  3. How can values-in-design perspectives help solve these technical problems?
  4. What interventions and strategies encourage values conversations within the technical work of infrastructure design?

The project addresses these questions using qualitative methods and targeted technical interventions. It uses a cooperative research approach, in which social scientists work alongside the NDN team of networking researchers. Project outcomes will include a detailed report on critical social, cultural, and economic considerations for the design of NDN, and of future network architectures more generally. Other expected outcomes include technical changes to the NDN architecture based on these considerations and developed through cross-disciplinary dialogue.

National Science Foundation CNS-1421876

Privacy in Citizen Science

Katherine Shilton (Principal Investigator)
Jennifer Preece (Co-Principal Investigator)
Anne Bowser (Co-Principal Investigator)

Citizen science is a form of collaboration in which members of the public participate in scientific research. Citizen science is increasingly facilitated by a variety of wireless, cellular, and satellite technologies. Data collected and shared using these technologies may threaten the privacy of volunteers. This project will discover factors that lead to, or alleviate, privacy concerns for citizen science volunteers, and will document the privacy protection practices used by citizen science coordinators and volunteers. The results of this research will include best practices and policy guidelines for supporting privacy in citizen science. They will be published in a whitepaper distributed by the Woodrow Wilson International Center for Scholars and the United States Citizen Science Association, ensuring that broad audiences in public policy and in the citizen science community benefit from this work.

National Science Foundation SES-1450625

Consumer Privacy Expectations in the Mobile Ecosystem

Kirsten Martin, George Washington University (Principal Investigator)

This project examines US consumers’ privacy expectations across mobile data contexts. We are conducting a series of surveys using factorial vignette methodology, in which respondents answer questions based on a series of hypothetical vignettes. This method allows us to examine multiple factors simultaneously (e.g., changes in context and types of privacy violations) by providing respondents with rich vignettes that are systematically varied. The method supports the identification of the implicit factors, and their relative importance, in deciding whether a situation meets or violates expectations of privacy. Study results will identify mobile contexts with similar privacy expectations (e.g., gaming, shopping, socializing, blogging, researching) and identify the relative importance of data use and contextual factors in developing privacy expectations for specific mobile data contexts.
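The systematic variation at the heart of a factorial vignette design can be sketched as a Cartesian product of factor levels, so that every combination of factors yields one vignette. The factor names and levels below are invented for illustration; the study’s actual factors and wording are not specified here.

```python
# Hypothetical sketch of factorial vignette construction: the full design
# is the Cartesian product of each factor's levels, giving one vignette
# per combination. Factor names and levels are illustrative only.
from itertools import product

factors = {
    "context": ["gaming", "shopping", "socializing"],
    "data_use": ["targeted advertising", "shared with a third party"],
    "actor": ["app developer", "platform"],
}

def build_vignettes(factors):
    """Return one dict per vignette, mapping each factor to a level."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

vignettes = build_vignettes(factors)
# 3 contexts x 2 data uses x 2 actors = 12 systematically varied vignettes
```

Each resulting dict can then be rendered into a short scenario shown to a respondent; because every level combination appears, the relative weight of each factor in privacy judgments can be estimated from responses.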

UMD ADVANCE Program


Other Resources

Contact us to find out more information about gaining access to project data sets.