2021 Conference on Privacy Engineering Practice and Respect (PEPR)

$30–$150 | June 10–11, 2021 @ 11:00 am–5:45 pm ET

Overview

PEPR 2022 will be hosted by USENIX from June 23-24, 2022. Please visit this page for more details.

PEPR is focused on designing and building products and systems with privacy and respect for their users and the societies in which they operate. Our goal is to improve the state of the art and practice of building for privacy and respect and to foster a deeply knowledgeable community of both privacy practitioners and researchers who collaborate towards that goal.

View Past Conferences

For questions or suggestions, please email us at [email protected].

Attend

Agenda

DAY 1 – THURSDAY, JUNE 10 (All Times EDT)

Time

Item

Speakers

11:00 am –
11:10 am

OPENING REMARKS

Lea Kissner, Twitter

Lorrie Cranor, Carnegie Mellon University

11:10 am –
12:25 pm

SESSION 1 – PRIVACY AT SCALE

Moderator: Caitlin Fennessy, IAPP

Privacy for Infrastructure: Addressing Privacy at the Root – Watch Recording

Download Presentation Slides

  • Joshua O’Madadhain, Gary Young, Google
  • Abstract: Public sentiment and policy on privacy have largely focused on product design and implementation, where privacy issues are most visible; as a result, privacy assessments are often product-focused. However, products’ privacy problems often have their roots in their infrastructure. This means both that the infrastructure may not get appropriate scrutiny and that privacy assessments, when applied to infrastructure, may not ask the right questions. Implementing privacy protections as far down the infrastructure stack as possible provides those protections to all users of all clients of that infrastructure. Contrariwise, infrastructure with privacy vulnerabilities externalizes the cost of addressing those vulnerabilities onto all of its clients. We present some strategies for addressing privacy concerns at the infrastructure level.
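
By way of illustration (not material from the talk), here is a minimal sketch of the "push protections down the stack" idea: a storage wrapper that caps retention for every client, so no individual product team has to remember to enforce it. All names are hypothetical.

```python
import time

class RetentionEnforcingStore:
    """Illustrative storage layer: every write carries an expiry, and
    expired records are invisible to every client of the store."""

    def __init__(self, max_retention_seconds):
        self._max_retention = max_retention_seconds
        self._records = {}  # key -> (value, expires_at)

    def put(self, key, value, retention_seconds=None):
        retention = retention_seconds or self._max_retention
        # The infrastructure, not the product, caps retention.
        retention = min(retention, self._max_retention)
        self._records[key] = (value, time.time() + retention)

    def get(self, key):
        value, expires_at = self._records.get(key, (None, 0))
        if time.time() >= expires_at:
            self._records.pop(key, None)  # lazy purge on read
            return None
        return value

store = RetentionEnforcingStore(max_retention_seconds=30 * 24 * 3600)
store.put("user:123:search", "query text")
print(store.get("user:123:search"))
```

Because the cap lives below the product layer, every client inherits it by construction rather than by policy review.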

Cryptographic Privacy-Enhancing Technologies in the Post-Schrems II Era – Watch Recording

Download Presentation Slides

  • Sunny Seon Kang, Data Privacy Attorney
  • Abstract: In November 2020, the European Data Protection Board (EDPB) published guidelines on ‘supplementary tools to ensure compliance with the EU level of protection of personal data’ following the seminal Schrems II decision. In this guidance, “split or multi-party processing” is recommended as a way to preserve privacy in cross-border computing environments.
    This talk examines in plain terms how this technical privacy safeguard works in practice, and the EDPB’s policy rationale for recommending decentralized analytics as a supplementary measure. Secure multi-party computation, or MPC, is a cryptographic privacy-enhancing technology that enables multiple parties to compute on an aggregated dataset without seeing each other’s data inputs. MPC has helped institutions to: (1) overcome data localization, (2) facilitate critical knowledge-sharing, and (3) diversify data sources to minimize bias in AI, without compromising privacy. The talk concludes with a call-to-action on mobilizing privacy-enhancing technologies with further interdisciplinary engagement between technologists, lawyers, and policymakers.
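
To make the MPC idea concrete, here is a toy additive secret-sharing sketch (a simplification for illustration, not any production protocol): each party splits its input into random-looking shares, compute parties sum only the shares they hold, and recombination reveals nothing but the aggregate.

```python
import secrets

PRIME = 2**61 - 1  # arithmetic is done modulo a public prime

def share(value, n_parties):
    """Split `value` into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three hospitals each secret-share a patient count.
inputs = [120, 75, 240]
all_shares = [share(v, 3) for v in inputs]

# Each compute party sums the shares it holds; no party sees raw inputs.
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# Recombining the partial sums reveals only the aggregate.
total = sum(partial_sums) % PRIME
assert total == sum(inputs)
print(total)  # 435
```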

Detecting and Handling Sensitive Information in a Data Platform – Watch Recording

Download Presentation Slides

  • Megha Arora, Mihir Patil, Palantir Technologies
  • Abstract: One critical aspect of incorporating privacy in data management solutions is the process of classifying data as sensitive or personal. However, detecting and acting on sensitive data is a challenging task at scale, as different organizations and industries have their own definitions of what constitutes sensitive data. Moreover, an ideal solution not only detects sensitive data, but also provides functionality to handle, limit, or minimize it in some capacity. In this talk, we break down what it means to identify and control potentially sensitive information flowing into a data platform, along with the design considerations of building a productized solution to this problem. We’ll explore both the technical solution itself and our learnings from deploying it across different fields, diverse regulatory requirements, and varying user privacy expectations.
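
As a rough illustration of one common approach, rule-based detection (a simplified sketch, not Palantir’s product), a scanner might sample column values against a configurable pattern set and flag columns whose hit rate crosses a threshold. The patterns and threshold below are hypothetical:

```python
import re

# Hypothetical rule set; real deployments combine rules, ML, and context.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def classify_column(values, min_hit_rate=0.5):
    """Flag a column as sensitive if enough sampled values match a pattern."""
    hits = {name: 0 for name in PATTERNS}
    for v in values:
        for name, pattern in PATTERNS.items():
            if pattern.search(str(v)):
                hits[name] += 1
    return {name: count / len(values)
            for name, count in hits.items()
            if count / len(values) >= min_hit_rate}

sample = ["alice@example.com", "bob@example.org", "not an email"]
print(classify_column(sample))  # {'email': 0.666...}
```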

12:25 pm –
12:45 pm

BREAK

12:45 pm –
1:35 pm

SESSION 2 – CONSENT

Moderator: Lorrie Cranor, Carnegie Mellon University

Designing Meaningful Privacy Choice Experiences for Users – Watch Recording

Download Presentation Slides

  • Yuanyuan Feng, Carnegie Mellon University; Yaxing Yao, University of Maryland
  • Abstract: Recent data privacy regulations worldwide, including the General Data Protection Regulation in the European Union and the California Consumer Privacy Act in the State of California, have established new requirements for privacy choices in the digital world. However, system practitioners often struggle to implement legally compliant privacy choices that also provide users with meaningful privacy control. In this talk, we will introduce a comprehensive design space for privacy choices, derived from a user-centered analysis of how people exercise privacy choices in real-world systems. This design space is a taxonomy of five important dimensions to consider when implementing privacy choices, giving system designers and developers a practical guide to designing meaningful privacy choice experiences for users in addition to achieving legal compliance.

Engineering a Consent Sandbox to Eliminate Annoying Pop-Ups and Dark Patterns – Watch Recording

Download Presentation Slides

  • Benjamin Brook, Transcend
  • Abstract: Pop-ups are an annoying ghost of internet history that just can’t seem to stay dead. In 2002, it was “you are the one-millionth visitor!” In 2021, it’s “this website uses cookies.” At a time when UX engineers are focused on providing a clean user experience, the web is suddenly cluttered once again with distracting, annoying pop-ups. This presentation will explain the path our engineering team took to find a way to eliminate pop-ups while maintaining regulatory compliance. We’ll share our experience exploring various alternatives and how we designed a way for users to land on a website without a banner, retaining tracking events locally on the user’s device until an appropriate time to ask for user consent and release the events to the site owner.
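
The hold-and-release pattern the abstract describes can be sketched generically (shown in Python here for consistency with the other examples; a real implementation would run in the browser). Events are buffered on-device and only dispatched once consent is granted; all names are illustrative:

```python
class ConsentSandbox:
    """Buffer tracking events locally; release them only after consent."""

    def __init__(self):
        self._pending = []
        self._consented = False

    def track(self, event, dispatch):
        if self._consented:
            dispatch(event)              # consent given: send immediately
        else:
            self._pending.append(event)  # hold on-device, nothing sent

    def grant_consent(self, dispatch):
        self._consented = True
        for event in self._pending:      # replay events held pre-consent
            dispatch(event)
        self._pending.clear()

    def deny_consent(self):
        self._pending.clear()            # drop all; nothing left the device

sandbox = ConsentSandbox()
sandbox.track({"page": "/pricing"}, dispatch=print)  # held, nothing printed
sandbox.grant_consent(dispatch=print)                # now released
```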

1:35 pm –
2:15 pm

SESSION 3 – RESPECTFUL ETHICS IN PRACTICE

Moderator: Lea Kissner, Twitter

Panel: Michelle Finneran Dennedy, Privatus Consulting; Andy Schou, Google; Rumman Chowdhury, META Director, Twitter

Watch Recording

In this panel, our experts will discuss one of the hardest problems facing privacy engineers: making ethical choices in sticky situations. Privacy engineers and those working to build respectful products and systems are very often the ones on the front lines of finding problems, especially problems that disproportionately affect marginalized groups. There very often aren’t any right answers in these situations, so how do we choose, given the difficult world in which we live? Come with your questions; come to discuss. This panel is intended to further this conversation and deepen all of our insights.

2:15 pm –
2:55 pm

NETWORKING BREAK

Birds of a Feather Topics:

  • Leadership And Structure Of Privacy Teams
  • Consent As Privacy Engineering
  • Data Ownership, And How To Move Away From Monolithic Data Systems
  • Measurement & Making The Business Case For Privacy
  • Privacy Engineering: Employee Training
  • Ask An NSF Program Officer Anything
  • Research Data Sharing: Challenges And Opportunities

2:55 pm –
4:10 pm

SESSION 4 – DATA DELETION

Moderator: Nuria Ruiz, Outschool.com, Wikipedia Volunteer

Deletion Framework: How Facebook Upholds its Commitments Towards Data Deletion – Watch Recording

Download Presentation Slides

  • Benoît Reitz, Facebook
  • Abstract: Deletion is a core expectation of users of social networks: they trust us with their data, and trust that they can be forgotten if they want to be. This talk will address how Facebook supports data deletion through a Deletion Framework that provides deletion guarantees at the company level. Topics we will discuss include how we reduce product developers’ involvement to a minimum by building on top of a structured data type specification language, how we protect against data loss by maintaining restoration logs, and how we find and clean up orphaned data at scale.
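
A toy model of annotation-driven deletion (hypothetical names, not Facebook’s actual framework) shows the two ideas the abstract highlights: edges in the data schema declare whether deletion cascades, and every destroyed node is journaled to a restoration log first.

```python
# DEEP: deleting the source also deletes the target.
# SHALLOW: deleting the source just drops the edge.
DEEP, SHALLOW = "deep", "shallow"

SCHEMA = {
    "User": {"posts": ("Post", DEEP), "groups": ("Group", SHALLOW)},
    "Post": {"comments": ("Comment", DEEP)},
    "Group": {}, "Comment": {},
}

def delete(node, graph, restoration_log):
    """Cascade per the schema annotations, journaling before destroying."""
    edges = graph.get(node["id"], {})
    restoration_log.append(dict(node))  # enables restore on mistakes
    for edge_name, (target_type, behavior) in SCHEMA[node["type"]].items():
        for target in edges.get(edge_name, []):
            if behavior == DEEP:
                delete(target, graph, restoration_log)
    graph.pop(node["id"], None)

log = []
graph = {"u1": {"posts": [{"id": "p1", "type": "Post"}]}}
delete({"id": "u1", "type": "User"}, graph, log)
print([n["id"] for n in log])  # ['u1', 'p1']
```

The key property is that the cascade behavior lives in the schema, so product code never re-implements deletion logic.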

The Life-Changing Magic of Tidying Up Data – Watch Recording

Download Presentation Slides

  • Nandita Rao, DoorDash
  • Abstract: Enterprise data is growing exponentially, and research estimates that up to 70% of data is no longer relevant, adequate, or necessary for carrying out the purpose for which it is processed. Data hoarding is a dangerous business strategy that can lead to privacy non-compliance and breaches. Organizations need a KonMari Method™ to tame the data sprawl and adopt data hygiene processes. This session will introduce a continuous data discovery and analytics approach to support defensible deletion of data that doesn’t ‘Spark Joy’. Using real-world case studies across industries, we will demonstrate how practical data minimization was achieved through each stage of the data lifecycle.
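
As a minimal sketch of what such a defensible-deletion sweep could look like (illustrative only, assuming a hypothetical data inventory with purpose and retention metadata):

```python
from datetime import datetime, timedelta

# Hypothetical inventory: each dataset carries a purpose and retention policy.
inventory = [
    {"name": "orders_2019", "purpose": "fulfillment",
     "last_used": datetime(2020, 1, 5), "retention_days": 365},
    {"name": "clickstream_raw", "purpose": None,   # no documented purpose
     "last_used": datetime(2021, 5, 1), "retention_days": 90},
]

def deletion_candidates(inventory, now=None):
    """Flag data with no documented purpose or past its retention window."""
    now = now or datetime.now()
    for ds in inventory:
        expired = now - ds["last_used"] > timedelta(days=ds["retention_days"])
        if ds["purpose"] is None or expired:
            yield ds["name"]

print(list(deletion_candidates(inventory)))
```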

“A Little Respect” – Erasure of Personal Data in Distributed Systems – Watch Recording

Download Presentation Slides

  • Neville Samuell, Ethyca
  • Abstract: The finality of the word “erasure” doesn’t capture the complexity of performing GDPR or CCPA-compliant data deletion across large, distributed systems. It’s not just the deletion that’s challenging. It’s the strategies needed to delete the data in compliance with relevant laws while maintaining the referential integrity necessary to support ongoing business requirements. In this code-driven walkthrough, Neville Samuell, VP Eng at Ethyca, will provide a reference implementation for privacy engineers to use as a blueprint for successful erasure across distributed systems.
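
One common strategy in this space, sketched below under assumed table structures (an illustration, not Ethyca’s reference implementation), is to overwrite identifying fields with a non-reversible token rather than dropping the row, so foreign keys elsewhere stay valid:

```python
import hashlib

users = {7: {"email": "dana@example.com", "name": "Dana"}}
orders = [{"order_id": 101, "user_id": 7, "total": 42.50}]

def erase_user(user_id):
    """'Erase' a user by overwriting identifying fields with a stable token,
    so rows referencing user_id stay valid for accounting and analytics."""
    token = hashlib.sha256(f"erased:{user_id}".encode()).hexdigest()[:12]
    users[user_id] = {"email": f"{token}@erased.invalid", "name": "[erased]"}

erase_user(7)
assert orders[0]["user_id"] in users  # referential integrity preserved
print(users[7])
```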

4:10 pm –
4:30 pm

BREAK

4:30 pm –
5:45 pm

SESSION 5 – DESIGN

Moderator: Divya Sharma, Google

Illustrating Privacy Engineering Concepts with Potty Talk – Watch Recording

Download Presentation Slides

  • Lorrie Cranor, Carnegie Mellon University
  • Abstract: While it may not be a topic that people like to talk about (unless you are a parent of young children), bathrooms are surprisingly useful for conveying concepts related to both privacy and usability. I will begin by presenting bathroom-related images I collected when I asked people to draw pictures of privacy for the Privacy Illustrated project. Then I will talk about how I took inspiration from a 2014 thought experiment/hoax to develop an educational exercise in which students first design notice and choice mechanisms for the use of smart toilets in public restrooms and then discuss how they would evaluate the effectiveness of such mechanisms. Finally, I will discuss how other images from bathrooms (hard-to-use showers, high-tech toilets with bidet features, etc.) can be used to illustrate usable privacy and security concepts in an accessible and entertaining manner.

Privacy UX At Scale: Foundational “truths” and design guidelines – Watch Recording

Download Presentation Slides

  • Manya Sleeper, Johanna Woll, Google
  • Abstract: Recent regulations and media coverage demonstrate that the privacy design space is rapidly evolving and increasingly important. At large tech companies, like Google, it is an ongoing effort to account for users’ privacy needs across products and surfaces, particularly as the technology landscape changes. In this session, members of Google’s Privacy UX (user experience) team will speak about providing baseline guidance to allow teams to design for privacy UX at scale. We will walk audience members through the challenges of scaling UX insights, as well as the approach we have taken — the development of 3 consolidated foundational privacy UX research truths, and 3 related design guidelines to address these truths in practice.

Is it Time Our Devices Showed a Little Respect? Informing the Design of Respectful Intelligent Systems – Watch Recording

Download Presentation Slides

  • William Seymour, King’s College London
  • Abstract: We often use the word respect to refer to how our systems should (and hopefully do) treat users, but what does this actually mean? In this talk we dive into a diverse variety of perspectives on what it means to respect people, exploring the role that respect plays in system design. Beginning by grounding the discussion in what respect means from a moral and social perspective, we then consider how this plays out in practice. Designers and developers might respect (or disrespect) their users through the artefacts they create, but also through the ways that systems mediate people’s communications and relationships. We conclude by discussing how respect complements and extends existing design principles in HCI around identity, fairness, and accessibility, distilling out key design principles to support the design of more ethical systems.

DAY 2 – FRIDAY, JUNE 11 (All Times EDT)

Time

Item

Speakers

11:00 am –
11:15 am

OPENING REMARKS

Lea Kissner, Twitter

Lorrie Cranor, Carnegie Mellon University

Remarks from U.S. Representative Haley Stevens

11:15 am –
12:30 pm

SESSION 6 – HEALTHCARE PRIVACY

Moderator: Giles Douglas, Nuna, Inc.

Towards More Informed Consent for Healthcare Data Sharing – Watch Recording

Download Presentation Slides

  • Sarah Pearman, Carnegie Mellon University
  • Abstract: As health apps and wearables become more prevalent, consumers are encouraged to carefully evaluate companies’ privacy practices before they disclose health information to entities other than their healthcare providers. In this work we explore the degree to which that is actually possible for users, via a case study of a consent flow for a chatbot being developed by a U.S. health insurer. Because their chatbot uses third-party APIs, users must view and agree to a HIPAA authorization before using the tool. In a two-phase interview and survey study we tested usability and user understanding of four versions of the consent flow. While our case study focuses on a particular application, it will also offer broader insights regarding people’s (mis)understandings of HIPAA, their preferences around healthcare privacy practices, and the extent to which they are likely to read and understand simplified disclosures.

Building a Scalable, Machine Learning-Driven Anonymization Pipeline for Clinical Trial Transparency – Watch Recording

Download Presentation Slides

  • David Di Valentino, Muqun (Rachel) Li, Privacy Analytics
  • Abstract: With the emergence of global regulatory frameworks centred around clinical trial transparency, such as EMA Policy 0070 and Health Canada’s “Public Release of Clinical Information” guidance document, the sharing of anonymized clinical study reports (CSRs) for transparency purposes has grown exponentially since the beginning of the previous decade. In order to build a scalable, automated architecture to quickly turn around such data sharing requests, several challenges related to the anonymization of CSR data must be addressed. These challenges make this problem quite unique in the privacy engineering landscape and require novel solutions to both efficiently anonymize the data and meet the myriad regulatory requirements associated with public data releases. This presentation will discuss the regulatory context of clinical trial transparency and enumerate the challenges and practical considerations around architecting and scaling a machine learning-driven pipeline for anonymizing CSR data for public release.
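
For flavor, the de-identification step of such a pipeline can be caricatured as rule-based substitution (real systems pair ML entity recognition with quantitative re-identification risk measurement; the rules below are hypothetical):

```python
import re

# Trivial rule-based pass; production pipelines combine ML entity
# recognition with measured re-identification risk, not just regexes.
RULES = [
    (re.compile(r"\bSubject\s+\d{4}\b"), "Subject [ID]"),
    (re.compile(
        r"\b\d{1,2}\s(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)\s\d{4}\b"),
     "[DATE]"),
]

def redact(text):
    """Apply each redaction rule to the CSR text in turn."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text

print(redact("Subject 1024 reported headache on 3 Mar 2019."))
# -> "Subject [ID] reported headache on [DATE]."
```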

Building for Digital Health – Watch Recording

Download Presentation Slides

  • Aditi Joshi, Security and Privacy Engineering, Google Cloud
  • Dr. Oliver Aalami, Vascular Surgeon at Stanford Medicine and Director of Digital Health at Stanford Biodesign
  • Abstract: Sharing information across systems is critical, and the global pandemic has underscored the importance of interoperability in health. Open-source frameworks on iOS and Android can leverage backend cloud infrastructure, which has made the development of digital health applications easier. With this ease comes a great sense of responsibility to create guidelines for privacy-preserving practices. Today, we want to highlight the work being done at Stanford in creating privacy-preserving digital health applications within a rich ecosystem where a diverse set of stakeholders operate. Our goal is to offer a set of guidelines for other organizations building digital health applications, whether they choose to leverage Stanford’s open-source CardinalKit framework or not. We would also love to get your feedback on how to make it better.

12:30 pm –
12:45 pm

BREAK

12:45 pm –
1:45 pm

SESSION 7 – PANEL DISCUSSION

Reconciling Privacy Engineering Scholarship with Industry Needs for Privacy Engineering – Watch Recording

  • Moderator: Dr. Sara Jordan, Future of Privacy Forum
  • Panel: Stacey Truex, Georgia Tech; Frank Pallas, TU Berlin; Sheila Colclasure, Kinesso; Lawrence You, Google
  • Abstract: How can privacy engineering scholars engage with the practice of privacy? How can privacy professionals guide privacy engineering scholars toward new research areas or new data? In this panel, we convene academics and industry experts to discuss how the two communities can support one another to make privacy work.

1:45 pm –
2:25 pm

NETWORKING BREAK

Birds of a Feather Topics:

  • Privacy Predictions
  • Foundational Versus Retrofitting Privacy Engineering
  • Gender And Privacy In Design And Development
  • Mechanics Of Building A Privacy Team Within An Entity. Who Do You Hire First? What’s The Best Ratio Of Legal To Engineer To Support Staff? How Do You Build A Team Stepwise? Do You Organize By Product Line, By Privacy Regime/Region?
  • Women In Privacy
  • Privacy Taxonomies

2:25 pm –
3:40 pm

SESSION 8 – ARCHITECTURES FOR PRIVACY

Moderator: Lea Kissner, Twitter

No Threat, No Sweat: Privacy Threat Modeling in Practice – a Machine Learning Use Case – Watch Recording

Download Presentation Slides

  • Kim Wuyts, imec-DistriNet, KU Leuven; Isabel Barberá, BitnessWise
  • Abstract: ‘Privacy. What can go wrong?’ A simple, yet essential question you should ask yourself when you are assessing the privacy of a system. Threat modeling is a known approach to systematically identify and mitigate security and privacy threats in software architectures. Several techniques, processes, and knowledge bases exist and have shown their value. But how can you make privacy threat modeling really work in practice? In this talk, we will first discuss the concept of privacy threat modeling, with a focus on the lightweight approach LINDDUN GO. In the second part of the talk, we will look at the practical application of LINDDUN GO in a Machine Learning environment where the generic concepts are being adapted to fully support the project-specific requirements.
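
For orientation, LINDDUN’s seven threat categories can be applied by systematically walking each element of a data-flow diagram against each category. The loop below is a simplified sketch of that elicitation (with hypothetical diagram elements), not LINDDUN GO’s card-based process:

```python
# LINDDUN's seven threat categories (from the framework itself).
CATEGORIES = ["Linking", "Identifying", "Non-repudiation", "Detecting",
              "Data disclosure", "Unawareness", "Non-compliance"]

# Hypothetical data-flow-diagram elements for an ML system.
dfd_elements = [
    {"name": "training-data store", "type": "data_store"},
    {"name": "model API", "type": "process"},
    {"name": "user -> model API", "type": "data_flow"},
]

def elicit(elements):
    """Walk every element against every category, yielding open questions
    for the team to answer (and mitigate) one by one."""
    for element in elements:
        for category in CATEGORIES:
            yield f"{element['name']}: could {category} occur here?"

for question in list(elicit(dfd_elements))[:3]:
    print(question)
```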

Lightweight Purpose Justification Service for Embedded Accountability – Watch Recording

Download Presentation Slides

  • Arnav Jagasia, Yeong Wei Wee, Palantir Technologies
  • Abstract: Ensuring adherence to, and accountability for, purpose limitation policies remains one of the more difficult data protection principles to uphold in modern big data platforms. Such systems are usually designed to be flexible and agnostic to the user’s specific workflows, and common processes to ensure purpose limitation – such as legal reviews or privacy impact assessments – tend to be one-off exercises that are disconnected from the everyday use of the platform. In this talk, we discuss our learnings and challenges in designing and deploying a solution that requires users to justify and contextualize sensitive actions they may take in a big data platform. We also cover how governance teams review these justifications to understand how their data platforms are used in practice. We believe such technical approaches, which supplement the often non-technical methods for managing purpose limitation, can improve overall data governance oversight.
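
One way to picture such a service (a hedged sketch with hypothetical names, not Palantir’s implementation) is a wrapper that refuses to run a sensitive action without a justification and appends each justification to an audit log for governance review:

```python
import functools, json, time

AUDIT_LOG = []  # in practice, an append-only store reviewed by governance

def requires_justification(action_name):
    """Refuse to run a sensitive action unless a justification is supplied;
    record who did what, why, and when for later review."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, user, justification=None, **kwargs):
            if not justification or len(justification.strip()) < 10:
                raise PermissionError(f"{action_name}: justification required")
            AUDIT_LOG.append({"action": action_name, "user": user,
                              "justification": justification,
                              "at": time.time()})
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_justification("export_patient_records")
def export_records(dataset):
    return f"exported {dataset}"

export_records("trial-42", user="analyst1",
               justification="Safety signal follow-up, ticket #881")
print(json.dumps(AUDIT_LOG[0], indent=2))
```

Because the check runs at the moment of use, the justification is tied to the everyday workflow rather than to a one-off review.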

Deleting the Undeletable – Watch Recording

Download Presentation Slides

  • Behrooz Shafiee, Shopify
  • Abstract: This talk will describe a recent multi-year effort at Shopify to refactor our analytics event collection platform in order to better facilitate data subject requests. We will explain how we architected our system, using simple techniques to classify, track, report on, and delete personal information embedded in the tens of billions of daily events that power our analytics engine. We’ll talk about the close collaboration between privacy and data teams to make the right thing the easy, default thing to do, and how we addressed our multi-year historical privacy backlog. We’ll share some insight into the hardest challenges and surprises we had to overcome.
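
The schema-level classification idea can be sketched as follows (hypothetical event schema, not Shopify’s actual system): every event field is classified up front, so deletion tooling can scrub a data subject’s events without guessing where personal data lives.

```python
# Hypothetical event schema: every field is classified at design time.
EVENT_SCHEMA = {
    "checkout_completed": {
        "shop_id":     "non_personal",
        "buyer_email": "personal",
        "cart_total":  "non_personal",
    }
}

def scrub_for_subject(events, field, value):
    """Redact classified-personal fields in events matching a data subject."""
    for event in events:
        schema = EVENT_SCHEMA[event["type"]]
        if event["fields"].get(field) == value:
            for name, cls in schema.items():
                if cls == "personal":
                    event["fields"][name] = None
    return events

events = [{"type": "checkout_completed",
           "fields": {"shop_id": 9, "buyer_email": "a@b.com", "cart_total": 20}}]
print(scrub_for_subject(events, "buyer_email", "a@b.com"))
```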

3:40 pm –
4:00 pm

BREAK

4:00 pm –
5:15 pm

SESSION 9 – PRIVACY FOR VULNERABLE POPULATIONS

Moderator: Diane Hosfelt

Considering Privacy when Communicating with Incarcerated People – Watch Recording

Download Presentation Slides

  • Kentrell Owens, University of Washington
  • Abstract: Surveillance of communication between incarcerated and non-incarcerated people has steadily increased, enabled partly by technological advancements. Private companies control communication tools for most U.S. prisons and jails and offer surveillance capabilities beyond what individual facilities could realistically implement. Frequent communication with family improves mental health and post-carceral outcomes for incarcerated people, but we wondered: does discomfort about surveillance affect how, when, or if their relatives communicate with them? To explore these questions, we conducted 16 semi-structured interviews with participants who have incarcerated relatives in Pennsylvania. Among other findings, we learned that participants communicate despite privacy concerns that they felt helpless to address. We also observed inaccuracies in participants’ beliefs about surveillance practices. In addition to sharing some findings from participants, I will discuss implications of inaccurate understandings of surveillance, misaligned incentives between end-users and vendors, recommendations for more privacy-sensitive communication tools, and lessons for privacy design for marginalized communities.

Security through Care: Abusability Insights from Tech Abuse Advocates – Watch Recording

Download Presentation Slides

  • Julia Slupska, Oxford Internet Institute, Centre for Doctoral Training in Cybersecurity, University of Oxford
  • Abstract: Gendered harassment, abuse, and surveillance were long ignored in traditional cybersecurity but have recently become a significant focus of research. Many solutions, which focus on improving survivors’ digital privacy and security practices, impose an unfair burden of “safety work” on those already made vulnerable by abuse. Drawing on interviews with advocates who support survivors of tech abuse, such as counsellors in domestic violence shelters, Julia Slupska will argue that these advocates are (often unacknowledged) cybersecurity workers and experts. In defending against abuse and supporting healing after the trauma it causes, advocates have developed alternative security practices that incorporate principles of consent, empowerment, and trauma-informed care. Their ideas for how technology designers could anticipate, mitigate, and respond to online abuse move beyond notions of digital security as purely technical security. This talk will outline lessons we can learn from these practices, as well as how privacy practitioners and engineers can better support this work.

If at First You Don’t Succeed: Norway’s Two Contact-Tracing Apps – Watch Recording

Download Presentation Slides

  • Eivind Arvesen, Sector Alarm
  • Abstract: Norway was one of the first European countries to launch a COVID-19 app in 2020, but in the span of a few months it was criticized by an independent committee, the development community at large, and Amnesty International, before gaining international notoriety and finally being shut down by the Norwegian DPA. Shortly thereafter, the Norwegian government decided that a new app should be based on the Google/Apple Exposure Notification system, and the Norwegian Institute of Public Health published a Request for Proposal. The winner (and only bidder!) was the Danish company Netcompany, which already had a similar contract with the Danish health authorities. I will discuss and compare the two apps’ degree of transparency, collaboration with the community, and architecture, including their privacy implications, vulnerabilities, trade-offs, and alternatives.

5:15 pm –
5:45 pm

SOCIAL EVENT

Leadership And Structure Of Privacy Teams

Consent As Privacy Engineering

Data Ownership, And How To Move Away From Monolithic Data Systems

Measurement & Making The Business Case For Privacy

Privacy Engineering: Employee Training

Ask An NSF Program Officer Anything

Research Data Sharing: Challenges And Opportunities

Privacy Predictions

Foundational Versus Retrofitting Privacy Engineering

Gender And Privacy In Design And Development

Building Privacy Teams

Women In Privacy

Privacy Taxonomies

Session 1: Privacy at Scale

Session 2: Consent

Session 3: Ethics Panel

Session 4: Data Deletion

Session 5: Design

Session 6: Healthcare Privacy

Session 7: Privacy Panel

Session 8: Architectures for Privacy

Session 9: Privacy for Vulnerable Populations

Jobs

And More….

Speakers

Dr. Oliver Aalami

Vascular Surgeon at Stanford Medicine and Director of Digital Health at Stanford Biodesign, Stanford University

Session 6: Healthcare Privacy

Dr. Oliver Aalami is a Clinical Associate Professor of Vascular & Endovascular Surgery at Stanford University and the Palo Alto VA and serves as the Director of Digital Health at Stanford Biodesign. He is the course director for Biodesign for Digital Health as well as Building for Digital Health, and one of the co-founders of the open-source project CardinalKit, which we are presenting today. His primary research focuses on clinically validating the sensors in smartphones and smartwatches in patients with cardiovascular disease to unlock the potential for precision health.

Megha Arora

Data Scientist, Privacy and Civil Liberties, Palantir Technologies

Session 1: Privacy at Scale

Megha Arora is a Data Scientist with the Privacy and Civil Liberties team at Palantir Technologies, where she focuses on tailoring Palantir to solve the data and analytical problems of its customers in a privacy-preserving way. She holds a Master’s degree in Computer Science from Carnegie Mellon University. Megha has led her own non-profit initiatives and data-driven projects for the Indian government and organizations like CERN. She cares deeply about incorporating fairness, transparency, and accountability in machine learning applications, and safeguarding digital information in our increasingly data-rich societies.

Eivind Arvesen

Group Cyber Security Manager, Sector Alarm

Session 9: Privacy for Vulnerable Populations

Eivind Arvesen is a security and privacy professional who currently works as Group Cyber Security Manager at one of Europe’s leading providers of monitored home security solutions.
He holds a master’s degree in computer science (with a focus on machine learning) and has previously worked as a software developer and architect, both in-house and as a consultant, on projects ranging from product MVPs to critical infrastructure.
In 2020, he was part of a government appointed expert panel tasked with evaluating the Norwegian COVID-19 app.

Isabel Barberá

Founder, BitnessWise

Session 8: Architectures for Privacy

Isabel Barberá is a data privacy and security specialist and the founder of the consultancy firm BitnessWise in the Netherlands. With almost 20 years of experience in IT and software development, she has spent the last six years assisting organisations in the public and private sectors with developing and implementing comprehensive data privacy and security programs, with a special focus on privacy by design.

Benjamin Brook

CEO, Transcend

Session 2: Consent

Ben Brook is the CEO and Co-Founder of Transcend, a data privacy technology company headquartered in San Francisco. Backed by Accel and Index Ventures, Transcend makes it simple for companies to give users control over their data, and is building engineering solutions to chart the path forward for modern data rights. Prior to co-founding Transcend, Ben studied computer science, astrophysics, and neuroscience at Harvard University.

Rumman Chowdhury

META Director, Twitter

Session 3: Respectful Ethics in Practice

Dr. Rumman Chowdhury’s passion lies at the intersection of artificial intelligence and humanity. She is a pioneer in the field of applied algorithmic ethics, working with C-suite clients to create cutting-edge technical solutions for ethical, explainable and transparent AI since 2017.

She is currently the Director of the ML Ethics, Transparency and Accountability (META) team at Twitter. Prior to joining Twitter, she was CEO and founder of Parity, an enterprise algorithmic audit platform company. She formerly served as Global Lead for Responsible AI at Accenture Applied Intelligence.

Rumman has been featured in international media, including the Wall Street Journal, Financial Times, Harvard Business Review, NPR, MIT Sloan Magazine, MIT Technology Review, BBC, Axios, Cheddar TV, CRN, The Verge, Fast Company, Quartz, Corriere della Sera, Optio, Australian Broadcasting Channel and Nikkei Business Times.

As service to the field and the larger community, she serves on the board of Oxford University’s Commission on AI and Governance, the University of Virginia’s Data Science Program, and Patterns, a data science journal by the publishers of Cell.

Dr. Chowdhury holds two undergraduate degrees from MIT, a master’s degree in Quantitative Methods of the Social Sciences from Columbia University, and a doctorate in political science from the University of California, San Diego.

Sheila Colclasure

Global Chief Digital Responsibility & Public Policy Officer, Kinesso

Session 7: Panel Discussion

As Global Chief Digital Responsibility and Public Policy Officer, Sheila leads the global data policy and digital responsibility strategies for Kinesso, ensuring that data and digital technology are used ethically and accountably across the enterprise and with IPG clients. This means ensuring data and tech are used in ways that serve people. She helps ensure practices operating at the leading edge of digital technology are consistent with principles of responsible, respectful, proportionate and fair data use. Sheila is responsible for public policy engagement with regulators, policy groups, clients and other key stakeholders globally, advocating for ethical advertising and marketing practices, in ways that earn trust. She is an advisor on the development and deployment of Kinesso’s data-driven and digital solutions and services. She is a trusted thought partner, advisor, and reputational champion for IPG companies.

Ms. Colclasure is a recognized global thought leader on applied data ethics, accountable data governance, and human-centered digital responsibility. Sheila has extensive knowledge of laws and societal expectations governing the collection and use of information, with particular depth in the rapidly evolving data-driven advertising and marketing ecosystem and ethical AI. She is continuously sought out by policy makers, regulators, and government agencies for her views on data integrity and how to address the complexity of operationalizing and harmonizing next-generation data governance for the global digital data-driven ecosystem. Sheila is a Presidential Leadership Scholar and was recognized by CSO as one of the “12 amazing women in security” (2017).

She is a frequent speaker and media interviewee and has advanced data leadership and policy with the marketplace, regulators, and lawmakers in many fora, including the U.S. HHS Datapalooza, Attorney General Alliance, Dublin Tech Summit, Global Data Transparency Lab, Information Accountability Foundation (IAF) Digital University for Regulator Series, and Ibero-American Data Protection Network. Sheila has presented key talks at global events for the Consumer Electronics Show, Forrester, adExchanger, International Association of Privacy Professionals, Healthcare Information and Management Systems Society, Digital Advertising Alliance, OutSell DataMoney, ShopTalk, Philly Phorum, American Bar Association, and the Marketing Sciences Institute.

Sheila serves on the advisory board of the IAF and is corporate liaison to several industry standards-setting groups.

Prior to joining IPG Kinesso, she was the Acxiom Global Chief Data Ethics Officer and Public Policy Executive, Manager of Congressional and Political Affairs for the American Institute of Certified Public Accountants in Washington, D.C., and Staff Assistant in the U.S. Senate. Sheila has a master’s degree in communications, specializing in business and political communication.

Lorrie Cranor

Director and Bosch Distinguished Professor, CyLab Security and Privacy Institute, FORE Systems Professor of Computer Science and of Engineering and Public Policy, Carnegie Mellon University

Session 2: Consent | Session 5: Design

Lorrie Faith Cranor @lorrietweet is the Director and Bosch Distinguished Professor of the CyLab Security and Privacy Institute and FORE Systems Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University. She also directs the CyLab Usable Privacy and Security Laboratory (CUPS) and co-directs the MSIT-Privacy Engineering masters program. In 2016 she served as Chief Technologist at the US Federal Trade Commission. She co-founded Wombat Security Technologies, a security awareness training company that was acquired by Proofpoint. She is a fellow of the ACM, IEEE, and AAAS, and a member of the ACM CHI Academy. She plays soccer and bass flute, designs quilts, and photographs interesting bathrooms.

David Di Valentino

Data Scientist, Clinical Trial Transparency (CTT), Privacy Analytics

Session 6: Healthcare Privacy

David Di Valentino, PhD is a Data Scientist in the Clinical Trial Transparency (CTT) team at Privacy Analytics. David applies his expertise in data science and data analysis to help clinical trial sponsors maximize the analytical value of their data, while meeting global standards for patient privacy.

Yuanyuan Feng

Postdoctoral Researcher, Carnegie Mellon University

Session 2: Consent

Yuanyuan Feng is a postdoctoral researcher at Carnegie Mellon University. Her research is centered around human-computer interaction and usable privacy & security. As part of her research, she also designs and implements human-centered privacy tools to help people take better control of their data privacy in the digital world.

Michelle Finneran Dennedy

Co-Founder, Privatus Consulting

Session 3: Ethics

Michelle Finneran Dennedy is a founder, well-trusted advisor, and industry expert increasing awareness around data privacy policies and the tools that provide privacy protections. Most recently, she co-founded Privatus Consulting, a firm that helps businesses accelerate data privacy strategy and ESG metrics. She is also the CEO and co-founder of a privacy engineering tools company, currently in stealth mode! Prior to that, Michelle held several leadership roles, including VP and Chief Privacy Officer, leading security and privacy initiatives focused on regulatory compliance and privacy engineering at organizations like Cisco, McAfee – Intel Security, Oracle, and Sun Microsystems. Michelle co-authored The Privacy Engineer’s Manifesto: Getting from Policy to Code to QA to Value, which takes a look at the challenges and opportunities raised within the emerging Consent Capital economy.
Connect with Michelle at:
Twitter: @mdennedy

Arnav Jagasia

Privacy and Civil Liberties Engineer, Palantir Technologies

Session 8: Architectures for Privacy

Arnav Jagasia is a Privacy and Civil Liberties Engineer at Palantir Technologies, where he works on designing and implementing privacy-protective technology. He holds a Master’s degree in Computer Science from the University of Pennsylvania, where his research focused on the application of US constitutional law to facial recognition and surveillance technologies. He also holds a BSE in Computer Science and a BS in Economics from the University of Pennsylvania.

Aditi Joshi

Google Cloud Security & Privacy Engineering

Session 6: Healthcare Privacy

Aditi Joshi is embedded in the Google Cloud Security & Privacy Engineering organization. She is passionate about helping developers build digital health applications in a privacy-preserving manner and is committed to democratizing healthcare. Aditi’s stints include a fellowship at Yale Law School and a Berkman Fellowship at Harvard, where she focused on data privacy.

Lea Kissner

Head of Privacy Engineering, Twitter

Session 3: Respectful Ethics in Practice

Lea is the Head of Privacy Engineering at Twitter. They work to build respect for users into products and systems through product design, privacy-enhancing infrastructure, application security, and novel research into both theoretical and practical aspects of privacy. They were previously the Global Lead of Privacy Technology at Google, Chief Privacy Officer of Humu, and at Apple, working for over a decade on projects including logs anonymization, infrastructure security, privacy infrastructure, and privacy engineering. They earned a Ph.D. in computer science (with a focus on cryptography) at Carnegie Mellon University and a BS in electrical engineering and computer science from UC Berkeley.

Muqun Li

Senior Machine Learning Engineer, Privacy Analytics

Session 6: Healthcare Privacy

Muqun (Rachel) Li, PhD is a Senior Machine Learning Engineer at Privacy Analytics. Rachel employs her machine learning and engineering expertise to conceive, build and refine the tools used by the CTT client services team. These tools help trial sponsors meet their obligations for transparency and privacy.

Joshua O'Madadhain

Google

Session 1: Privacy at Scale

Joshua O’Madadhain worked for 10 years on Google infrastructure before joining its privacy team in 2018 to lead its infrastructure privacy efforts. Prior to his work at Google, he worked as an applied researcher at Microsoft on topics in user protection (spam, phishing, botnets); prior to that, he did research on predictive models for social networks at UC Irvine. Joshua’s interests in privacy include making privacy happen at scale and approaching it as an area of applied ethics.

Kentrell Owens

Computer Science and Engineering PhD Student, Security & Privacy Research Lab, University of Washington

Session 9: Privacy for Vulnerable Populations

Kentrell Owens is a first-year Computer Science and Engineering PhD student at the University of Washington in the Security & Privacy Research Lab, advised by Dr. Franziska Roesner and Dr. Tadayoshi Kohno. He recently earned his M.S. in electrical and computer engineering from Carnegie Mellon University, where he worked with Dr. Lorrie Cranor and Dr. Camille Cobb. His research area is computer security and privacy, and he’s specifically interested in the security and privacy needs of marginalized communities.

Frank Pallas

Senior Researcher, TU Berlin

Session 7: Panel Discussion

Frank Pallas is a senior researcher with the Information Systems Engineering Research Group, TU Berlin, Germany. His research interests particularly include interdisciplinary aspects of privacy and security in cloud-, fog-, and web-based computing as well as in the IoT. He received the Ph.D. degree with distinction in computer science from TU Berlin.

Mihir Patil

Software Development Lead, Privacy and Civil Liberties, Palantir Technologies

Session 1: Privacy at Scale

Mihir Patil is a Software Development Lead on the Privacy and Civil Liberties team at Palantir Technologies, where he builds and deploys privacy-protective products for essential organizations that Palantir serves. In addition to his technical endeavors, Mihir is the instructor of the Introduction to Privacy Engineering Udacity course to help embed the values of privacy engineering within the engineering community. He holds a B.S. in Electrical Engineering & Computer Science from the University of California at Berkeley.

Sarah Pearman

PhD Student, Institute for Software Research, Carnegie Mellon University

Session 6: Healthcare Privacy

Sarah Pearman is a PhD student in Societal Computing in the Institute for Software Research at Carnegie Mellon University. Sarah’s primary research area is usable privacy and security. She is especially interested in data use disclosures, opt-out mechanisms, and privacy controls that help users make informed and meaningful choices about their data.

Nandita Rao

Sr Privacy Program Manager, DoorDash

Session 4: Data Deletion

Nandita Rao is a Sr Privacy Program Manager at DoorDash, where she works on all things privacy. Previously, she helped Fortune 500 companies build Privacy, Cybersecurity, and Information Governance programs. She was also part of a data profiling startup that helps solve privacy compliance challenges. She holds an MS in Information Security from Carnegie Mellon University, where privacy first piqued her interest.

Benoît Reitz

Engineering Manager, Deletion Systems, Facebook

Session 4: Data Deletion

Benoît Reitz is an engineering manager supporting Facebook’s Deletion Systems team, on which he previously worked as an engineer. His focus was building the infrastructure required to run graph deletions, ultimately enabling Facebook to uphold its commitment to its users’ right to be forgotten. He holds a Master’s in Computer Science from EPITA.

Neville Samuell

VP of Engineering, Ethyca

Session 4: Data Deletion

Neville Samuell is an engineer, entrepreneur, and lifelong learner with a never-ending curiosity for building teams, companies, websites, apps, and robots. As VP of Engineering at Ethyca, Neville leads a fully distributed team of developers and testers building out Ethyca’s data privacy automation platform, which helps businesses of all sizes efficiently honor privacy rights for their customers. In his previous roles as VP of Engineering at Degreed and CTO of Pathgather, he led the development of enterprise SaaS products used by millions of employees worldwide to measure and improve their learning and skills. A Waterloo Mechatronics Engineering graduate, Neville is now on a mission to make respect for users an integral part of business operations and software development everywhere.

Andreas Schou

Staff Privacy Engineer, Privacy and Data Protection Officer, Google

Session 3: Ethics

Andreas Schou is a Staff Privacy Engineer in the Privacy and Data Protection Office at Google, focusing on Google’s abuse and machine learning systems. His work centers on preventing memorization and disclosure of user data by machine learning models and minimizing exceptional data handling in production abuse systems.

Sunny Seon Kang

Data Privacy Attorney

Session 1: Privacy at Scale

Sunny Seon Kang is a data privacy attorney. She was previously Senior Privacy Counsel and Head of Policy at Inpher where she advised on U.S. and international data privacy laws, FinTech regulations, and AI policy. Prior to joining Inpher, Sunny was International Consumer Counsel at the Electronic Privacy Information Center (EPIC) and a Fellow at Stanford Law School. She has testified before the U.S. Consumer Product Safety Commission on IoT data security, filed FTC complaints against Facebook’s data practices, and advised regulatory agencies on GDPR implementation, algorithmic accountability, and privacy by design. Sunny has also worked on emerging technology and privacy matters under the former California Attorney General Kamala D. Harris. She holds degrees from University College London, UC Berkeley School of Law (LL.M. in Technology and IP), and Stanford Law School (J.S.M. in Juridical Science). She is a member of the New York bar.

William Seymour

Research Associate, King's College London

Session 5: Design

William Seymour is a research associate at King’s College London. His work focuses on privacy and security issues in smart homes, and he is currently working on an EPSRC-funded project about developing secure AI assistants. He received his PhD in cyber security from the University of Oxford.

Behrooz Shafiee

Staff Privacy Engineer, Shopify

Session 8: Architectures for Privacy

Behrooz Shafiee is a staff privacy engineer at Shopify, where he works on building scalable privacy tooling and helps teams respect merchants’ and buyers’ privacy. He received his MSc in Computer Science from the University of Waterloo in 2015. Outside of the binary world, he enjoys being upside down (gymnastics), on a bike, on skis, or in the woods.
Twitter: @behroozshafiee

Manya Sleeper

Senior UX Researcher, Google

Session 5: Design

Manya Sleeper, Senior UX Researcher at Google, has worked on usable security and privacy at Google since 2016. Prior to joining Google, she completed a PhD at Carnegie Mellon University focused on usable online sharing. Her academic research has covered a range of usable security and privacy topics, including at-risk users’ needs, security warnings, and privacy and security decision making around online sharing.

Julia Slupska

Doctoral Student, Centre for Doctoral Training in Cybersecurity, Oxford Internet Institute

Session 9: Privacy for Vulnerable Populations

Julia Slupska is a doctoral student at the Centre for Doctoral Training in Cybersecurity and the Oxford Internet Institute. Her research focuses on how cybersecurity concepts and practices can address technologically-mediated abuse like image-based sexual abuse (or ‘revenge porn’) and stalking. She is also exploring how feminist theories and methods—such as participatory action research and the ethics of care—can improve cybersecurity.

Stacey Truex

Georgia Institute of Technology

Session 7: Panel Discussion

Stacey is a fifth-year PhD student in the School of Computer Science at the Georgia Institute of Technology’s College of Computing. She is a graduate researcher in the Distributed Data Intensive Systems Lab directed by Professor Ling Liu, where her research focuses on two complementary perspectives: (1) privacy, security, and trust in machine learning models and algorithmic decision making, and (2) secure, privacy-preserving artificial intelligence systems, services, and applications. Her PhD research has been generously supported by the IBM PhD Fellowship, the Microsoft Research Women’s Fellowship, and the Georgia Tech Presidential, Haley, and Institute for Information Security & Privacy Cybersecurity Fellowships. Prior to starting at Georgia Tech, she received her Bachelor’s degrees in Computer Science (B.S.) and Mathematics (B.A.) from Wake Forest University. She then spent some time in industry as a software developer before returning to school at the University of Washington, where she received her Master’s of Computer Science and Systems. While pursuing her Master’s, she was also a researcher in the University of Washington Institute of Technology’s Center for Data Science as a member of both the Healthcare Analytics and Security research groups.

Yeong Wei Wee

Privacy and Civil Liberties Engineer, Palantir Technologies

Session 8: Architectures for Privacy

Yeong Wei Wee is a Privacy and Civil Liberties Engineer at Palantir Technologies. Yeong has 8 years of experience at the intersection of policy and technology and is dedicated to the technical implementation of privacy-enhancing technologies. He works in close collaboration with engineers and clients, and his project engagements have spanned more than a dozen industries across Europe and the United States. Yeong holds a BA summa cum laude in Computer Science and East Asian Studies from the University of Pennsylvania.

Johanna Woll

Senior Staff UX Writer, Google

Session 5: Design

Johanna Woll, Senior Staff UX Writer at Google, has been developing messaging frameworks and content for consent components, user data controls, and privacy products at Google since 2018. She has a background in content strategy, journalism, and library science and worked in startup, corporate, and academic roles before joining Google.

Kim Wuyts

Postdoctoral Researcher, imec-DistriNet, KU Leuven

Session 8: Architectures for Privacy

Kim Wuyts is a postdoctoral researcher at the imec-DistriNet research group at KU Leuven in Belgium. She has more than 10 years of experience in security and privacy in software engineering. Her research primarily focuses on privacy threat modeling. Kim is one of the driving forces behind the development and extension of LINDDUN, a privacy threat modeling framework. She is also a co-author of the Threat Modeling Manifesto.

Lawrence You

Director of Privacy for Product & Engineering, Google

Session 7: Panel Discussion

Lawrence You has been Google’s Director of Privacy for Product and Engineering since 2013 and is a Distinguished Privacy Engineer. He joined Google’s Systems Infrastructure group in 2004, leading the logs storage and analysis team. He has been an advisor to product, engineering, policy, and legal teams across Google on matters related to large-scale data analysis, data management, security, and privacy technology. Currently he drives privacy initiatives within Google, sets strategic technical direction, and works with a team of Privacy Engineers to establish design and development practices.

Prior to Google, Lawrence’s research was in scalable, space-efficient archival storage systems. As a software engineer, he developed mobile and embedded software platforms and software development tools at Pixo, Metrowerks, Taligent, and Apple. Lawrence holds a PhD in computer science from the University of California, Santa Cruz, and an MS in computer science and a BS in electrical engineering from Stanford University.

Gary Young

Google

Session 1: Privacy at Scale

Gary Young has been a software engineer at Google since 2007, and has been deeply involved in the privacy space for almost all of that time; he is now one of Google’s senior technical leads in privacy across several areas spanning policy and practice.

Sponsored by

BECOME A SPONSOR: Sponsorship exposes your brand to highly qualified attendees, supports open access to our conference content, and keeps PEPR conferences affordable. To learn more, please contact us at [email protected].

The acceptance of any organization as a sponsor does not imply explicit or implicit approval by PEPR of the donor organization’s values or actions. In addition, sponsorship does not provide any control over conference program content.
