All Times on this page are UTC+4 (Dubai, Muscat)

Other timezones can be found here: UTC, UTC+1, UTC+2, UTC+3, UTC+4, UTC+5, UTC+6, UTC+7, UTC+8, UTC+9, UTC+10, UTC+11, UTC+12, UTC-1, UTC-2, UTC-3, UTC-4, UTC-5, UTC-6, UTC-7, UTC-8, UTC-9, UTC-10, UTC-11, UTC-12

Conference Schedule

17:15

Welcome to PETS Online (Rochester Room)

17:30

Opening remarks (Rochester Room)

17:40

Mini-Break

17:50

Session 1A: Private Machine Learning

Chair: Esfandiar Mohammadi, 17:50–19:20

Scaling up Differentially Private Deep Learning with Fast Per-Example Gradient Clipping (artifact)
Jaewoo Lee (University of Georgia) and Daniel Kifer (Penn State University)

DPlis: Boosting Utility of Differentially Private Deep Learning via Randomized Smoothing
Wenxiao Wang (Tsinghua University), Tianhao Wang (Harvard University), Lun Wang (University of California, Berkeley), Nanqing Luo (Huazhong University of Science and Technology), Pan Zhou (Huazhong University of Science and Technology), Dawn Song (University of California, Berkeley), and Ruoxi Jia (Virginia Tech)

Differentially Private Naive Bayes Classifier using Smooth Sensitivity
Farzad Zafarani (Purdue University) and Chris Clifton (Purdue University)

privGAN: Protecting GANs from membership inference attacks at low cost to utility
Sumit Mukherjee (Microsoft), Yixi Xu (Microsoft), Anusua Trivedi (Microsoft), Nabajyoti Patowary (Microsoft), and Juan Lavista Ferres (Microsoft)

Session 1B: Cryptography

Chair: Sherman Chow, 17:50–19:20

Faster homomorphic comparison operations for BGV and BFV
Ilia Iliashenko (imec-COSIC, KU Leuven, Belgium) and Vincent Zucca (Équipe DALI, Université de Perpignan via Domitia, France; LIRMM, UMR 5506, Université de Montpellier, CNRS, France)

Controlled Functional Encryption Revisited: Multi-Authority Extensions and Efficient Schemes for Quadratic Functions
Miguel Ambrona (NTT Secure Platform Laboratories), Dario Fiore (IMDEA Software Institute), and Claudio Soriente (NEC Laboratories Europe)

Private Stream Aggregation with Labels in the Standard Model
Johannes Ernst (University of St. Gallen) and Alexander Koch (Karlsruhe Institute of Technology)

Mercurial Signatures for Variable-Length Messages
Elizabeth C. Crites (University College London) and Anna Lysyanskaya (Brown University)

Session 1C: Privacy Policies

Chair: Veelasha Moonsamy, 17:50–19:20

Automated Extraction and Presentation of Data Practices in Privacy Policies (artifact)
Duc Bui (University of Michigan), Kang G. Shin (University of Michigan), Jongmin Choi (Samsung Research), and Jun Bum Shin (Samsung Research)

Data Portability between Online Services: An Empirical Analysis on the Effectiveness of GDPR Art. 20
Emmanuel Syrmoudis (Technical University of Munich), Stefan Mager (Ludwig-Maximilians-University of Munich), Sophie Kuebler-Wachendorff (Ludwig-Maximilians-University of Munich), Paul Pizzinini (Ludwig-Maximilians-University of Munich), Jens Grossklags (Technical University of Munich), and Johann Kranz (Ludwig-Maximilians-University of Munich)

Privacy Preference Signals: Past, Present and Future
Maximilian Hils (University of Innsbruck), Daniel Woods (University of Innsbruck), and Rainer Böhme (University of Innsbruck)

Unifying Privacy Policy Detection
Henry Hosseini (University of Münster), Martin Degeling (Ruhr University Bochum), Christine Utz (Ruhr University Bochum), and Thomas Hupperich (University of Münster)

19:20

Poster Session 1 and Break

19:20–20:00

20:00

Session 2A: Data Privacy I

Chair: Chris Clifton, 20:00–21:30

Differential Privacy at Risk: Bridging Randomness and Privacy Budget
Ashish Dandekar (École Normale Supérieure, Paris, France), Debabrota Basu (Chalmers University of Technology, Gothenburg, Sweden), and Stéphane Bressan (National University of Singapore, Singapore)

Growing synthetic data through differentially-private vine copulas (artifact)
Sébastien Gambs (UQAM), Frédéric Ladouceur (Ericsson), Antoine Laurent (UQAM), and Alexandre Roy-Gaumond (UQAM)

SoK: Managing Longitudinal Privacy of Publicly Shared Personal Online Data
Theodor Schnitzler (Ruhr-Universität Bochum), Shujaat Mirza (Courant Institute of Mathematical Sciences, New York University), Markus Dürmuth (Ruhr-Universität Bochum), and Christina Pöpper (New York University Abu Dhabi)

DyPS: Dynamic, Private and Secure GWAS
Túlio Pascoal (SnT, University of Luxembourg), Jérémie Decouchant (SnT, University of Luxembourg), Antoine Boutet (Insa-Lyon, CITI, Inria), and Paulo Esteves-Verissimo (SnT, University of Luxembourg)

Session 2B: Multiparty Private Machine Learning I

Chair: Arkady Yerukhimovich, 20:00–21:30

SoK: Privacy-Preserving Collaborative Tree-based Model Learning
Sylvain Chatel (EPFL), Apostolos Pyrgelis (EPFL), Juan Ramón Troncoso-Pastoriza (EPFL), and Jean-Pierre Hubaux (EPFL)

Secure Training of Decision Trees with Continuous Attributes
Mark Abspoel (CWI, The Netherlands), Daniel Escudero (Aarhus University, Denmark), and Nikolaj Volgushev (Pleo Technologies ApS)

SoK: Privacy-Preserving Computation Techniques for Deep Learning
José Cabrero-Holgueras (CERN) and Sergio Pastrana (Universidad Carlos III de Madrid)

Falcon: Honest-Majority Maliciously Secure Framework for Private Deep Learning (artifact)
Sameer Wagh (Princeton University), Shruti Tople (Microsoft Research), Fabrice Benhamouda (Algorand Foundation), Eyal Kushilevitz (Technion), Prateek Mittal (Princeton University), and Tal Rabin (Algorand Foundation)

Session 2C: Privacy Preferences

Chair: Wendy Seltzer, 20:00–21:30

Validity and Reliability of the Scale Internet Users’ Information Privacy Concerns (IUIPC)
Thomas Gross (Newcastle University)

"Warn Them" or "Just Block Them"?: Comparing Privacy Concerns of Older and Working Age Adults
Hirak Ray (University of Maryland, Baltimore County), Flynn Wolf (University of Maryland, Baltimore County), Ravi Kuber (University of Maryland, Baltimore County), and Adam J. Aviv (The George Washington University)

"Did you know this camera tracks your mood?": Modeling People's Privacy Expectations and Preferences in the Age of Video Analytics
Shikun Zhang (Carnegie Mellon University), Yuanyuan Feng (Carnegie Mellon University), Lujo Bauer (Carnegie Mellon University), Lorrie Cranor (Carnegie Mellon University), Anupam Das (North Carolina State University), and Norman Sadeh (Carnegie Mellon University)

"I would have to evaluate their objections": Privacy tensions between smart home device owners and incidental users
Camille Cobb (Carnegie Mellon University), Sruti Bhagavatula (Carnegie Mellon University), Kalil Anderson Garrett (Carnegie Mellon University), Alison Hoffman (Carnegie Mellon University), Varun Rao (Carnegie Mellon University), and Lujo Bauer (Carnegie Mellon University)

21:30

Poster Session 2 and Break

21:30–22:10

21:45

PETS Happy Hour 21:45–23:00

Welcome to PETS meet-and-greet (Host: Susan McGregor), Ford Foundation Room
Karaoke and Open Mic!, Karaoke Lounge in the Ford Foundation Garden
Privacy-Themed Movie Night (The Great Hack), Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
Games: Hang out in the Game Zone

5:00 (July 13)

PETS After Dark (in the US) Poster Session & Mingle

Ask Me Anything (AMA) with Ian Goldberg!, AMA room in the Ford Foundation Garden
Privacy-Themed Movie Night (The Great Hack), Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
PETS Rewind and Poster Session
Games: Hang out in the Game Zone

17:30

Session 3A: Data Privacy II

Chair: Carmela Troncoso, 17:30–19:00

Privacy-Preserving Multiple Tensor Factorization for Synthesizing Large-Scale Location Traces with Cluster-Specific Features (artifact)
Takao Murakami (AIST), Koki Hamada (NTT), Yusuke Kawamoto (AIST), and Takuma Hatano (NSSOL)

Face-Off: Adversarial Face Obfuscation (artifact)
Varun Chandrasekaran (University of Wisconsin-Madison), Chuhan Gao (Microsoft), Brian Tang (University of Wisconsin-Madison), Kassem Fawaz (University of Wisconsin-Madison), Somesh Jha (University of Wisconsin-Madison), and Suman Banerjee (University of Wisconsin-Madison)

FoggySight: A Scheme for Facial Lookup Privacy
Ivan Evtimov (University of Washington), Pascal Sturmfels (University of Washington), and Tadayoshi Kohno (University of Washington)

On the (Im)Practicality of Adversarial Perturbation for Image Privacy
Arezoo Rajabi (Oregon State University), Rakesh B. Bobba (Oregon State University), Mike Rosulek (Oregon State University), Charles V. Wright (Portland State University), and Wu-chi Feng (Portland State University)

Session 3B: Multiparty Private Machine Learning II

Chair: Sameer Wagh, 17:30–19:00

SoK: Efficient Privacy-preserving Clustering
Aditya Hegde (International Institute of Information Technology Bangalore), Helen Möllering (Technical University of Darmstadt), Thomas Schneider (Technical University of Darmstadt), and Hossein Yalame (Technical University of Darmstadt)

Scalable Privacy-Preserving Distributed Learning
David Froelicher (EPFL), Juan R. Troncoso-Pastoriza (EPFL), Apostolos Pyrgelis (EPFL), Sinem Sav (EPFL), Joao Sa Sousa (EPFL), Jean-Philippe Bossuat (EPFL), and Jean-Pierre Hubaux (EPFL)

Efficient homomorphic evaluation of k-NN classifiers
Martin Zuber (CEA, LIST) and Renaud Sirdey (CEA, LIST)

Privacy-Preserving Approximate k-Nearest-Neighbors Search that Hides Access, Query and Volume Patterns
Alexandra Boldyreva (Georgia Institute of Technology) and Tianxin Tang (Georgia Institute of Technology)

Session 3C: Privacy Behaviors

Chair: Fabio Massacci, 17:30–19:00

The Role of Privacy in Digitalization – Analyzing Perspectives of German Farmers
Sebastian Linsner (Technical University of Darmstadt), Franz Kuntke (Technical University of Darmstadt), Enno Steinbrink (Technical University of Darmstadt), Jonas Franken (Technical University of Darmstadt), and Christian Reuter (Technical University of Darmstadt)

Digital inequality through the lens of self-disclosure
Jooyoung Lee (The Pennsylvania State University), Sarah Rajtmajer (The Pennsylvania State University), Eesha Srivatsavaya (The Pennsylvania State University), and Shomir Wilson (The Pennsylvania State University)

"We, three brothers have always known everything of each other": A Cross-cultural Study of Sharing Digital Devices and Online Accounts
Mahdi Nasrullah Al-Ameen (Utah State University), Huzeyfe Kocabas (Utah State University), Swapnil Nandy (Jadavpur University), and Tanjina Tamanna (University of Dhaka)

The Motivated Can Encrypt (Even with PGP)
Glencora Borradaile (Oregon State University), Kelsy Kretschmer (Oregon State University), Michele Gretes (Stand), and Alexandria LeClerc (Oregon State University)

19:00

Poster Session 3 and Break

19:00–19:40

19:40

Keynote: Shipping Privacy Enhancing Technologies to a Billion Devices (Rochester Room)

Erik Neuenschwander
Chair: Julien Freudiger, 19:40–21:10

Abstract: At Apple, we believe that privacy is a fundamental human right. This talk will discuss how Apple has innovated and shipped privacy enhancing technologies to a billion devices, including Private Federated Learning and the forthcoming iCloud Private Relay.

Erik Neuenschwander is Apple’s Director of User Privacy, in charge of privacy engineering efforts across Apple’s products and services. Erik’s organization supports teams throughout the company to design amazing experiences with groundbreaking privacy protections, delivering features like Intelligent Tracking Prevention and Differential Privacy, as well as privacy-forward services like Apple News and Maps. The User Privacy team focuses on privacy by default, including data minimization, technical limits on data use, application of data protection, on-device processing, and privacy-preserving technologies. Erik has over eighteen years of experience in software technology, including roles at Casio, Microsoft, and Apple. He holds a B.S. in Symbolic Systems and an M.A. in Philosophy from Stanford University and was a Teaching Fellow in Stanford’s Computer Science department.

21:10

Break

21:30

Town Hall (Rochester Room)

22:30

PETS Happy Hour

LGBTQIA+ Meet Up (Hosts: Carmela Troncoso and Rebekah Overdorf), Ford Foundation Room
Karaoke and Open Mic!, Karaoke Lounge in the Ford Foundation Garden
Virtual Escape Room
Open Unconference Sessions, Ford Foundation Room
Games: Hang out in the Game Zone

5:00 (July 14)

PETS After Dark (in the US) Poster Session & Mingle

Ask Me Anything (AMA) with Jeremy Epstein!, AMA room in the Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
PETS Rewind and Poster Session
Games: Hang out in the Game Zone

17:30

Session 4A: Privacy Attacks

Chair: Athina Markopoulou, 17:30–19:00

Genome Reconstruction Attacks Against Genomic Data-Sharing Beacons
Kerem Ayoz (Bilkent University), Erman Ayday (Case Western Reserve University), and Ercument Cicek (Bilkent University)

DNA Sequencing Flow Cells and the Security of the Molecular-Digital Interface
Peter Ney (University of Washington), Lee Organick (University of Washington), Jeff Nivala (University of Washington), Luis Ceze (University of Washington), and Tadayoshi Kohno (University of Washington)

Supervised Authorship Segmentation of Open Source Code Projects
Edwin Dauber (Drexel University), Robert Erbacher (U.S. Army Research Laboratory), Gregory Shearer (ICF International), Michael Weisman (U.S. Army Research Laboratory), Frederica Nelson (U.S. Army Research Laboratory), and Rachel Greenstadt (New York University)

Revisiting Membership Inference Under Realistic Assumptions
Bargav Jayaraman (University of Virginia), Lingxiao Wang (University of California Los Angeles), Katherine Knipmeyer (University of Virginia), Quanquan Gu (University of California Los Angeles), and David Evans (University of Virginia)

Session 4B: Applied Cryptography I

Chair: Ryan Henry, 17:30–19:00

CrowdNotifier: Decentralized Privacy-Preserving Presence Tracing
Wouter Lueks (EPFL), Seda Gürses (TU Delft), Michael Veale (UCL), Edouard Bugnion (EPFL), Marcel Salathé (EPFL), Kenneth G. Paterson (ETHZ), and Carmela Troncoso (EPFL)

EL PASSO: Efficient and Lightweight Privacy-preserving Single Sign On (artifact)
Zhiyi Zhang (UCLA), Michał Król (City, University of London), Alberto Sonnino (UCL), Lixia Zhang (UCLA), and Etienne Riviere (UCLouvain)

SGX-MR: Regulating Dataflows for Protecting Access Patterns of Data-Intensive SGX Applications
A K M Mubashwir Alam (Marquette University), Sagar Sharma (HP Inc), and Keke Chen (Marquette University)

Residue-Free Computing
Logan Arkema (Georgetown University) and Micah Sherr (Georgetown University)

Session 4C: Privacy Awareness

Chair: Kassem Fawaz, 17:30–19:00

Exploring Mental Models of the Right to Informational Self-Determination of Office Workers in Germany
Jan Tolsdorf (Hochschule Bonn-Rhein-Sieg University of Applied Sciences), Florian Dehling (Hochschule Bonn-Rhein-Sieg University of Applied Sciences), Delphine Reinhardt (University of Göttingen), and Luigi Lo Iacono (Hochschule Bonn-Rhein-Sieg University of Applied Sciences)

Awareness, Adoption, and Misconceptions of Web Privacy Tools
Peter Story (Carnegie Mellon University), Daniel Smullen (Carnegie Mellon University), Yaxing Yao (Carnegie Mellon University), Alessandro Acquisti (Carnegie Mellon University), Lorrie Faith Cranor (Carnegie Mellon University), Norman Sadeh (Carnegie Mellon University), and Florian Schaub (University of Michigan)

Defining Privacy: How Users Interpret Technical Terms in Privacy Policies
Jenny Tang (Wellesley College), Hannah Shoemaker (Pomona College), Ada Lerner (Wellesley College), and Eleanor Birrell (Pomona College)

Managing Potentially Intrusive Practices In The Browser: A User-Centered Perspective
Daniel Smullen (Carnegie Mellon University), Yaxing Yao (University of Maryland, Baltimore County), Yuanyuan Feng (Carnegie Mellon University), Norman Sadeh (Carnegie Mellon University), Arthur Edelstein (Mozilla), and Rebecca Weiss (Mozilla)

19:00

Poster Session 4 and Break

19:00–19:40

19:40

Session 5A: Web Tracking

Chair: Umar Iqbal, 19:40–21:10

A calculus of tracking: theory and practice (artifact)
Giorgio Di Tizio (University of Trento) and Fabio Massacci (University of Trento)

Déjà vu: Abusing Browser Cache Headers to Identify and Track Online Users (artifact)
Vikas Mishra (Inria / Univ. Lille), Pierre Laperdrix (CNRS / Univ. Lille / Inria), Walter Rudametkin (Univ. Lille / Inria), and Romain Rouvoy (Univ. Lille / Inria)

ML-CB: Machine Learning Canvas Block (artifact)
Nathan Reitinger (UMD) and Michelle L. Mazurek (UMD)

Unveiling Web Fingerprinting in the Wild Via Code Mining and Machine Learning
Valentino Rizzo (Ermes Cyber Security SRL), Stefano Traverso (Ermes Cyber Security SRL), and Marco Mellia (Politecnico di Torino)

Session 5B: Applied Cryptography II

Chair: Wouter Lueks, 19:40–21:10

SoK: Privacy-Preserving Reputation Systems
Stan Gurtler (University of Waterloo) and Ian Goldberg (University of Waterloo)

Fast Privacy-Preserving Punch Cards
Saba Eskandarian (Stanford University)

Unlinkable Updatable Hiding Databases and Privacy-Preserving Loyalty Programs (artifact)
Aditya Damodaran (University of Luxembourg) and Alfredo Rial (University of Luxembourg)

You May Also Like... Privacy: Recommendation Systems Meet PIR
Adithya Vadapalli (Indiana University), Fattaneh Bayatbabolghani (UC Berkeley), and Ryan Henry (University of Calgary)

Session 5C: Internet of Things Privacy

Chair: Jason Xue, 19:40–21:10

The Audio Auditor: User-Level Membership Inference in Internet of Things Voice Services
Yuantian Miao (Swinburne University of Technology), Minhui Xue (The University of Adelaide), Chao Chen (Swinburne University of Technology), Lei Pan (Deakin University), Jun Zhang (Swinburne University of Technology), Benjamin Zi Hao Zhao (The University of New South Wales and CSIRO-Data61), Dali Kaafar (Macquarie University and CSIRO-Data61), and Yang Xiang (Swinburne University of Technology)

Real-time Analysis of Privacy-(un)aware IoT Applications
Leonardo Babun (Florida International University), Z. Berkay Celik (Purdue University), Patrick McDaniel (Pennsylvania State University), and A. Selcuk Uluagac (Florida International University)

Blocking Without Breaking: Identification and Mitigation of Non-Essential IoT Traffic
Anna Maria Mandalari (Imperial College London), Daniel J. Dubois (Northeastern University), Roman Kolcun (Imperial College London), Muhammad Talha Paracha (Northeastern University), Hamed Haddadi (Imperial College London), and David Choffnes (Northeastern University)

Defending Against Microphone-Based Attacks with Personalized Noise
Yuchen Liu (Indiana University Bloomington), Ziyu Xiang (Indiana University Bloomington), EJ Seong (Indiana University Bloomington), Apu Kapadia (Indiana University Bloomington), and Donald Williamson (Indiana University Bloomington)

21:10

Poster Session 5 and Break

21:10–21:50

21:50

Rump Session (Rochester Room)

Chair: Adam Aviv, 21:50–23:00

23:00

PETS Happy Hour

Privacy-Themed Movie Night (The Social Dilemma), Ford Foundation Garden
Women in PETS (Host: Christine Fossaceca), Ford Foundation Room
Accessible “Privacy Facts” Labels (Host: Georgio Nicolas), Ford Foundation Room
Karaoke and Open Mic!, Karaoke Lounge in the Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
Games: Hang out in the Game Zone

5:00 (July 15)

PETS After Dark (in the US) Poster Session & Mingle

Building and Deploying PETS (Host: Florian Kerschbaum), Ford Foundation Room
Privacy-Themed Movie Night (The Social Dilemma), Ford Foundation Garden
Karaoke and Open Mic!, Karaoke Lounge in the Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
Games: Hang out in the Game Zone

17:30

Session 6A: Censorship and Certificates

Chair: Rebekah Overdorf, 17:30–19:00

Too Close for Comfort: Morasses of (Anti-)Censorship in the Era of CDNs
Devashish Gosain (IIIT Delhi), Mayank Mohindra (IIIT Delhi), and Sambuddho Chakravarty (IIIT Delhi)

A First Look at Private Communications in Video Games using Visual Features
Abdul Wajid (SEECS, NUST), Nasir Kamal (SEECS, NUST), Muhammad Sharjeel (SEECS, NUST), Raaez Muhammad Sheikh (SEECS, NUST), Huzaifah Bin Wasim (SEECS, NUST), Muhammad Hashir Ali (SEECS, NUST), Wajahat Hussain (SEECS, NUST), Syed Taha Ali (SEECS, NUST), and Latif Anjum (SEECS, NUST)

Privacy-Preserving & Incrementally-Deployable Support for Certificate Transparency in Tor (artifact)
Rasmus Dahlberg (Karlstad University), Tobias Pulls (Karlstad University), Tom Ritter (Mozilla), and Paul Syverson (U.S. Naval Research Laboratory)

LogPicker: Strengthening Certificate Transparency against covert adversaries
Alexandra Dirksen (TU Braunschweig), David Klein (TU Braunschweig), Robert Michael (TU Braunschweig), Tilman Stehr, Konrad Rieck (TU Braunschweig), and Martin Johns (TU Braunschweig)

Session 6B: Cryptography and Cryptocurrencies

Chair: Markulf Kohlweiss, 17:30–19:00

SwapCT: Swap Confidential Transactions for Privacy-Preserving Multi-Token Exchanges
Felix Engelmann (Aarhus University), Lukas Müller (Ulm University), Andreas Peter (University of Twente), Frank Kargl (Ulm University), and Christoph Bösch (Ulm University)

HashWires: Hyperefficient Credential-Based Range Proofs
Konstantinos Chalkias (Facebook / Novi), Shir Cohen (Technion), Kevin Lewi (Facebook / Novi), Fredric Moezinia (Facebook / Novi), and Yolan Romailler (Facebook / Novi)

Gage MPC: Bypassing Residual Function Leakage for Non-Interactive MPC
Ghada Almashaqbeh (University of Connecticut), Fabrice Benhamouda (Algorand Foundation), Seungwook Han (Columbia University), Daniel Jaroslawicz (Columbia University), Tal Malkin (Columbia University), Alex Nicita (Columbia University), Tal Rabin (University of Pennsylvania and Algorand Foundation), Abhishek Shah (Columbia University), and Eran Tromer (Columbia University and Tel Aviv University)

Foundations of Ring Sampling
Viktoria Ronge (Friedrich-Alexander University Erlangen-Nuremberg), Christoph Egger (Friedrich-Alexander University Erlangen-Nuremberg), Russell W. F. Lai (Friedrich-Alexander University Erlangen-Nuremberg), Dominique Schröder (Friedrich-Alexander University Erlangen-Nuremberg), and Hoover H. F. Yin (The Chinese University of Hong Kong)

Session 6C: Mobile Privacy

Chair: Sébastien Gambs, 17:30–19:00

zkSENSE: A Friction-less Privacy-Preserving Human Attestation Mechanism for Mobile Devices
Iñigo Querejeta-Azurmendi (Universidad Carlos III Madrid / ITFI CSIC), Panagiotis Papadopoulos (Telefonica), Matteo Varvello (Nokia Bell Labs), Antonio Nappa (University of California, Berkeley), Jiexin Zhang (Cambridge University), and Benjamin Livshits (Brave Software / Imperial College London)

Less is More: A privacy-respecting Android malware classifier using federated learning
Rafa Gálvez (imec-COSIC ESAT/KU Leuven), Veelasha Moonsamy (Ruhr University Bochum), and Claudia Diaz (imec-COSIC ESAT/KU Leuven)

Three Years Later: A Study of MAC Address Randomization In Mobile Devices And When It Succeeds
Ellis Fenske (US Naval Academy), Dane Brown (US Naval Academy), Jeremy Martin (MITRE), Travis Mayberry (US Naval Academy), Peter Ryan, Jr. (MITRE), and Erik Rye (CMAND)

Who Can Find My Devices? Security and Privacy of Apple's Crowd-Sourced Bluetooth Location Tracking System (artifact)
Milan Stute (Technical University of Darmstadt), Alexander Heinrich (Technical University of Darmstadt), Tim Kornhuber (Technical University of Darmstadt), and Matthias Hollick (Technical University of Darmstadt)

19:00

Poster Session 6 and Break

19:00–19:40

19:40

Session 7A: Website Fingerprinting

Chair: Marc Juarez, 19:40–20:50

Website Fingerprinting in the Age of QUIC (artifact)
Jean-Pierre Smith (ETH Zurich), Adrian Perrig (ETH Zurich), and Prateek Mittal (Princeton)

GANDaLF: GAN for Data-Limited Fingerprinting
Se Eun Oh (University of Minnesota), Nate Mathews (Rochester Institute of Technology), Mohammad Saidur Rahman (Rochester Institute of Technology), Matthew Wright (Rochester Institute of Technology), and Nicholas Hopper (University of Minnesota)

Domain name encryption is not enough: privacy leakage via IP-based website fingerprinting
Nguyen Phong Hoang (Stony Brook University), Arian Akhavan Niaki (University of Massachusetts, Amherst), Phillipa Gill (University of Massachusetts, Amherst), and Michalis Polychronakis (Stony Brook University)

Session 7B: Secure Multiparty Computation

Chair: Meisam Mohammady, 19:40–20:50

Fortified Multi-Party Computation: Taking Advantage of Simple Secure Hardware Modules
Brandon Broadnax, Alexander Koch (Karlsruhe Institute of Technology), Jeremias Mechler (Karlsruhe Institute of Technology), Tobias Müller (FZI Research Center for Information Technology), Jörn Müller-Quade (Karlsruhe Institute of Technology), and Matthias Nagel

Secure integer division with a private divisor
Mark Abspoel (CWI) and Thijs Veugen (TNO)

Multiparty Homomorphic Encryption from Ring-Learning-With-Errors
Christian Mouchet (EPFL), Juan Troncoso-Pastoriza (EPFL), Jean-Philippe Bossuat (EPFL), and Jean-Pierre Hubaux (EPFL)

Session 7C: DNS and Privacy

Chair: Tariq Elahi, 19:40–20:50

Oblivious DNS over HTTPS (ODoH): A Practical Privacy Enhancement to DNS
Sudheesh Singanamalla (University of Washington / Cloudflare Inc.), Suphanat Chunhapanya (Cloudflare Inc.), Jonathan Hoyland (Cloudflare Inc.), Marek Vavruša (Cloudflare Inc.), Tanya Verma (Cloudflare Inc.), Peter Wu (Cloudflare Inc.), Marwan Fayed (Cloudflare Inc.), Kurtis Heimerl (University of Washington), Nick Sullivan (Cloudflare Inc.), and Christopher Wood (Cloudflare Inc.)

The CNAME of the Game: Large-scale Analysis of DNS-based Tracking Evasion
Yana Dimova (imec-DistriNet, KU Leuven), Gunes Acar (imec-COSIC, KU Leuven), Lukasz Olejnik (European Data Protection Supervisor), Wouter Joosen (imec-DistriNet, KU Leuven), and Tom Van Goethem (imec-DistriNet, KU Leuven)

Holes in the Geofence: Privacy Vulnerabilities in "Smart" DNS Services
Rahel A. Fainchtein (Georgetown University), Adam J. Aviv (The George Washington University), Micah Sherr (Georgetown University), Stephen Ribaudo (Georgetown University), and Armaan Khullar (Georgetown University)

20:50

Poster Session 7 and Break

20:50–21:30

21:30

Awards Session (Rochester Room)

21:50

Closing Remarks (Rochester Room)

22:00

PETS Happy Hour

Ask Me Anything (AMA) with Roger Dingledine of Tor!, AMA room in the Ford Foundation Garden
PETS Pet Gala, Karaoke Lounge in the Ford Foundation Garden
Open Unconference Sessions, Ford Foundation Room
Games: Hang out in the Game Zone

5:00 (July 16)

PETS After Dark (in the US) Poster Session & Mingle

Censorship Resistance with David Fifield (Host: David Fifield), Ford Foundation Room
PETs for defending dissent (Host: Glencora Borradaile), Ford Foundation Room
Open Unconference Sessions, Ford Foundation Room
PETS Rewind and Poster Session
Games: Hang out in the Game Zone

17:30

Opening remarks (Rochester Room)

17:35

Session 1

17:35–18:50, Rochester Room

Leveraging Strategic Connection Migration-Powered Traffic Splitting for Privacy
Mona Wang (Princeton)

Abstract: Network-level adversaries have been developing increasingly sophisticated techniques to perform surveillance and exert control over user traffic. We present a novel Connection Migration Powered Splitting (CoMPS) framework to construct multiple new defenses against various traffic analysis attacks. CoMPS limits the amount of information a particular adversary can observe on the network by performing traffic splitting within individual sessions. CoMPS is the first framework to fully support mid-session traffic splitting across heterogeneous network paths and protocols, without the need for deploying additional network infrastructure. CoMPS is not only readily deployable with any protocol supporting connection migration (e.g., QUIC, WireGuard, and Mosh), but also incurs very little overhead.

We implement a working prototype of CoMPS and use it to develop a novel defense against website fingerprinting attacks. To evaluate the effectiveness of our defense, we use both simulated splitting data and web traffic that is split in real time using our prototype. Our defense outperforms other state-of-the-art web fingerprinting defenses against a powerful, adaptive adversary, while incurring smaller overhead (decreasing throughput by just 7%). We also propose this framework for other network privacy use cases, such as censorship circumvention.
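The core splitting idea can be sketched in a few lines. This is an illustration under simplifying assumptions, not the CoMPS implementation; the function names are invented. Each chunk of a session is routed over one of two paths at random, so an observer on a single path sees only part of the traffic pattern, while the receiver reassembles the session losslessly by sequence number.

```python
import random

def split_session(chunks, weight=0.5, seed=None):
    """Assign each chunk of a session to one of two network paths at
    random, so an observer sitting on either path sees only a subset
    of the traffic pattern. Sequence numbers permit lossless reassembly."""
    rng = random.Random(seed)
    path_a, path_b = [], []
    for seq, chunk in enumerate(chunks):
        (path_a if rng.random() < weight else path_b).append((seq, chunk))
    return path_a, path_b

def reassemble(path_a, path_b):
    """Receiver side: merge both paths and restore the original order."""
    return [chunk for _, chunk in sorted(path_a + path_b)]

# A 10-chunk session split across two paths and reassembled without loss.
chunks = [f"chunk{i}".encode() for i in range(10)]
a, b = split_session(chunks, seed=42)
assert reassemble(a, b) == chunks
```

In a real deployment the "paths" would be distinct network routes reached via connection migration (e.g., a QUIC connection migrating between interfaces), and the split schedule itself could be randomized or adversarially chosen per defense.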

Honest-but-Curious Nets: Sensitive Information about Private Inputs can be Secretly Coded into the Outputs of Machine Learning Classifiers
Mohammad Malekzadeh (Imperial College London)

Abstract: It is known that deep neural networks, trained for the classification of a non-sensitive target attribute, can reveal sensitive attributes of their input data through features of different granularity extracted by the classifier. Taking a step forward, we show that deep classifiers can be trained to secretly encode a sensitive attribute of users' input data, at inference time, into the classifier's outputs for the target attribute. The attack works even if users have a white-box view of the classifier and keep all internal representations hidden except for the classifier's estimation of the target attribute. We introduce an information-theoretical formulation of such adversaries and present efficient empirical implementations for training honest-but-curious (HBC) classifiers based on this formulation: deep models that can be accurate in predicting the target attribute, but also can utilize their outputs to secretly encode a sensitive attribute. Our evaluations on several tasks in real-world datasets show that a semi-trusted server can build a classifier that is not only perfectly honest but also accurately curious. Our work highlights a vulnerability that can be exploited by malicious machine learning service providers to attack their users' privacy in several seemingly safe scenarios, such as encrypted inferences, computation at the edge, or private knowledge distillation. We conclude by showing the difficulties in distinguishing between standard and HBC classifiers and discussing an extension of this attack to a more general setting where, by allowing a few more queries, an attacker can not only infer a sensitive attribute but also (approximately) reconstruct the whole private input.
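To make the covert channel concrete, here is a toy sketch (all names invented; this is not the paper's information-theoretic construction, which trains the channel into the model itself). A classifier's output vector has slack in its low-probability entries, so one sensitive bit can be hidden in the ordering of the two least likely classes without changing the honest prediction or the total probability mass.

```python
def encode_bit(probs, bit, eps=1e-4):
    """Hide one sensitive bit in a probability vector by redistributing a
    tiny amount of mass between the two least likely classes. The argmax
    (the 'honest' prediction) and the total mass are unchanged.
    Assumes at least three classes and a suitably small eps."""
    p = list(probs)
    order = sorted(range(len(p)), key=lambda k: p[k])
    a, b = sorted(order[:2])               # two smallest entries, by class index
    lo, hi = (b, a) if bit else (a, b)     # bit=1 encodes p[a] > p[b]
    m = (p[a] + p[b]) / 2
    p[hi], p[lo] = m + eps, m - eps
    return p

def decode_bit(probs):
    """A colluding server recovers the bit from the ordering of the two
    least likely classes."""
    order = sorted(range(len(probs)), key=lambda k: probs[k])
    a, b = sorted(order[:2])
    return 1 if probs[a] > probs[b] else 0

p = [0.7, 0.2, 0.06, 0.04]
for bit in (0, 1):
    q = encode_bit(p, bit)
    assert decode_bit(q) == bit
    assert max(range(len(q)), key=q.__getitem__) == 0   # prediction unchanged
```

The talk's point is that an HBC model does this implicitly through training, making the channel hard to detect by inspecting outputs alone.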

Adversarial Detection Avoidance Attacks: Evaluating the robustness of perceptual hashing-based client-side scanning
Shubham Jain (Imperial College London)

Abstract: End-to-end encryption (E2EE) in messaging platforms enables people to communicate securely and privately with one another. Its widespread adoption has, however, raised concerns that illegal content might now be shared undetected. Following the global pushback against key escrow systems, client-side scanning based on perceptual hashing has recently been proposed by governments and researchers to detect illegal content in E2EE communications. In this talk, we will present what is, to the best of our knowledge, the first framework to evaluate the robustness of perceptual hashing-based client-side scanning. We will describe a new class of detection avoidance attacks and show that current systems are not robust.

More specifically, we will present a general black-box attack against any perceptual hashing algorithm and two white-box attacks for discrete cosine transform-based algorithms. We show perceptual hashing-based client-side scanning mechanisms to be highly vulnerable to detection avoidance attacks in a black-box setting. In a large-scale evaluation, we show that more than 99.9% of images can be successfully attacked while preserving the content of the image. Furthermore, we show that our attack generates diverse perturbations, strongly suggesting that straightforward mitigation strategies would be ineffective. Finally, we show that the larger thresholds necessary to make the attack harder would likely require more than one billion images to be flagged and decrypted daily, raising strong privacy concerns. Taken together, our results cast serious doubt on the robustness of perceptual hashing-based client-side scanning mechanisms currently proposed by governments, organizations, and researchers around the world.
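The threshold structure these attacks exploit can be illustrated with the simplest possible perceptual hash, an average hash over pixel values. This is a toy illustration only: deployed perceptual hashes are far more sophisticated, and the perturbation below is not the authors' attack, just a demonstration that small, content-preserving pixel changes can flip near-threshold hash bits.

```python
def average_hash(img):
    """Toy perceptual hash: one bit per pixel, set when the pixel value
    exceeds the image mean. Deployed hashes are far more elaborate but
    share this thresholding structure."""
    mean = sum(v for row in img for v in row) / (len(img) * len(img[0]))
    return [1 if v > mean else 0 for row in img for v in row]

def hamming(h1, h2):
    return sum(x != y for x, y in zip(h1, h2))

def evade(img, delta=3):
    """Toy detection-avoidance perturbation: push pixels sitting within
    delta of the mean across the threshold. Each pixel changes by at
    most 2*delta, yet the near-threshold hash bits flip."""
    mean = sum(v for row in img for v in row) / (len(img) * len(img[0]))
    out = []
    for row in img:
        out.append([
            v - 2 * delta if mean < v <= mean + delta else
            v + 2 * delta if mean - delta <= v <= mean else v
            for v in row
        ])
    return out

img = [[90, 80], [84, 86]]    # 2x2 grayscale "image", mean 85
adv = evade(img)
assert hamming(average_hash(img), average_hash(adv)) >= 2
```

A matching database lookup typically flags images within a small Hamming distance of a known hash, so flipping even a few bits can defeat detection while the image remains visually unchanged.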

18:50

Break

19:15

PETs and DPAs: perfect is the enemy of good

Marit Hansen 19:15–20:15 Rochester Room Abstract: The European General Data Protection Regulation (GDPR) has changed the data protection regime throughout Europe, and, in a globalised world, it also affects data controllers outside the EU. Article 25 of the GDPR demands data protection by design and by default. Is everybody now legally required to use PETs? Will there be sanctions for those who ignore Article 25? Are data controllers obliged to go for the perfect solution with the highest data protection guarantees? In her talk, Marit will answer those questions and show what role PETs play in the daily work of Data Protection Authorities (DPAs). Today's IT is obviously still lacking built-in data protection: how can this change? Marit will give some insight from the perspective of DPAs, point out what (else) is needed to effectively promote the idea of data protection by design, and discuss what other instruments could complement the endeavors of the different stakeholders to put PETs into practice.

Marit Hansen has been the State Data Protection Commissioner of Land Schleswig-Holstein and Chief of Unabhängiges Landeszentrum für Datenschutz (ULD) since 2015. Before being appointed Data Protection Commissioner, she had been Deputy Commissioner for seven years. Within ULD she established the “Privacy Technology Projects” Division and the “Innovation Centre Privacy & Security”.

Since completing her diploma in computer science in 1995, she has been working on privacy and security aspects. Marit's focus is on "data protection by design" and "data protection by default" from both the technical and the legal perspectives. She often gives talks and has been lecturing at various universities and academies.

20:15

Break

21:15

Session 2

21:15–22:30 Rochester Room CVEs from CNN Joel Reardon (University of Calgary)

Abstract: Sometimes ads and analytics libraries behave more like malware, doing things like actively circumventing operating system protections, scanning home networks, and encrypting all their strings to be decrypted at runtime. And these libraries are included in apps with hundreds of millions of installs! In this talk, we'll look at a sample set of the more extreme behaviours we've found over time, examine what these libraries are doing, and walk through the steps we followed to actually figure that out.

(Un)clear and (In)conspicuous: The right to opt-out of sale under CCPA Eleanor Birrell (Pomona College)

Abstract: The California Consumer Privacy Act (CCPA)---which began enforcement on July 1, 2020---grants California users the affirmative right to opt-out of the sale of their personal information. In this work, we perform a series of observational studies to understand how websites implement this right and how this implementation has evolved over the first year. We perform manual analyses of the top 500 U.S. websites and classify how each site implements this new requirement; we also perform automated analyses of the top 5000 U.S. websites. We find that the vast majority of sites that implement opt-out mechanisms do so with a Do Not Sell link rather than with a privacy banner, and that many of the linked opt-out controls exhibit features such as nudging and indirect mechanisms (e.g., fillable forms). We then perform a pair of user studies with 4357 unique users (recruited from Google Ads and Amazon Mechanical Turk) in which we observe how users interact with different opt-out mechanisms and evaluate how the implementation choices we observed---exclusive use of links, prevalent nudging, and indirect mechanisms---affect the rate at which users exercise their right to opt-out of sale. We find that these design elements significantly deter interactions with opt-out mechanisms---including reducing the opt-out rate for users who are uncomfortable with the sale of their information---and that they reduce users' awareness of their ability to opt-out. Our results demonstrate the importance of regulations that provide clear implementation requirements in order to empower users to exercise their privacy rights.

Harm reduction for cryptographic backdoors Martin Kleppmann (University of Cambridge)

Abstract: When law enforcement agencies (LEAs) ask for backdoors in end-to-end encryption systems, most information security professionals' reaction is wholesale rejection. This pushes LEAs to use zero-day exploits instead, which is harmful to security overall. Perhaps it would be better to have an explicit backdoor mechanism that ensures accountability and has safeguards to prevent it from being used for mass surveillance.

I propose the following: a provider of a communication service (e.g. Facebook) maintains a publicly readable transparency log, similar to Certificate Transparency, containing all of the law enforcement intercept orders they have received and accepted. Each log entry contains the jurisdiction of the warrant, a code indicating the reason (terrorism, child sexual abuse, etc.), validity start and end date, and a cryptographic commitment to a single device ID that is the target of the warrant. Thus, anybody can see how many warrants are being issued in which jurisdiction and for which reason, but not who their targets are.

To intercept a device, the communication service provider must first add the entry to the log, then send a message to the device that reveals the device ID in the commitment, and a proof that the entry is included in the log. The software on the user's device checks whether the log entry is for its own device ID, and if it is valid, the software silently uploads a cleartext copy of the requested data to the appropriate LEA. This upload feature is essentially identical to the cloud backup feature that is already built into otherwise encrypted messaging apps such as WhatsApp and iMessage; the only effect of the backdoor is to enable this backup, even if it had been disabled by the user.
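The proof of log inclusion that the device checks could be a standard Merkle audit path, as used in Certificate Transparency. A minimal sketch, assuming the log is a simple SHA-256 Merkle tree (function names and the odd-leaf duplication rule are illustrative assumptions):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def _pad(level: list[bytes]) -> list[bytes]:
    # Duplicate the last node when a level has an odd number of entries.
    return level + [level[-1]] if len(level) % 2 else level

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = _pad(level)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Audit path for leaves[index]: sibling hashes plus a left/right flag."""
    proof = []
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = _pad(level)
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_inclusion(leaf: bytes, proof, root: bytes) -> bool:
    """What the device runs: recompute the root from the leaf and audit path."""
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root
```

The device only needs the claimed log entry, the audit path, and the published root: a valid path shows the entry really is in the public log, so the provider cannot intercept anyone without leaving the audit trail.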

Additionally, in each jurisdiction there is a trusted oversight board. The service provider reveals the target of each log entry to the oversight board in the appropriate jurisdiction; the board checks that each log entry has a corresponding warrant, and that the warrant is genuine and legal. If the board determines that the system is being abused, it has legal powers to stop it.

Unlike key escrow and other backdoor proposals, this approach ensures the backdoor cannot be used without leaving a public audit trail, and it does not involve any weakening of the cryptographic protocols. There is no single "golden key" that can decrypt all communications. Service providers are forced to be explicit about the jurisdictions in which they will accept warrants. The number of targeted users is public, which gives us reassurance that the system is not being used for mass surveillance.

Moreover, the system is simple enough that non-technical people can understand it. It places more faith in established democratic structures (e.g. the judiciary and our democratically elected representatives) and less trust in unaccountable tech companies. LEAs have a democratic mandate to investigate crimes, and I believe this proposal enables LEAs to do their job, while also protecting the civil liberties that form the foundation of a democratic society.

22:30

Awards and Closing (Rochester Room)

22:45

Virtual Ice Cream (Ford Foundation Garden)