Pierre Tholoniat
Ph.D. Student in Computer Science, Columbia University

I am a Ph.D. student in computer science at Columbia University, working with Asaf Cidon and Roxana Geambasu. My research focuses on security and privacy, particularly differential privacy, applied cryptography, and distributed systems. Lately, I have been working on infrastructure systems for differential privacy, applied to privacy-preserving machine learning and analytics.

More generally, I enjoy solving socially meaningful problems by building practical systems that rely on sound theoretical foundations. I also like contributing to open-source projects. I’m always happy to talk about research and privacy – feel free to contact me at [email protected].

A bit more about me: I did my undergrad in mathematics and computer science at École Polytechnique in France. My name is pronounced [pjɛʁ to.lo.ɲa].

Resume

Education

  • Columbia University. M.S. & Ph.D. in Computer Science (2019 – current). New York, NY.

  • École Polytechnique. B.S. & M.S. in Engineering (2016 – 2019). Palaiseau, FR.

  • Lycée Sainte-Geneviève. Prépa in Mathematics and Physics (2014 – 2016). Versailles, FR.

Experience

  • Columbia University. Graduate Research Assistant (2021 – current). New York, NY.

  • Cloudflare. Research Intern (2023). San Francisco, CA.

  • Microsoft Research. Research Intern (2022). Redmond, WA.

  • École Normale Supérieure. Research Intern (2020). Paris, FR.

Before coming to Columbia for grad school, I spent some time in academic labs, the startup world, and the military:

  • The University of Sydney. Visiting Researcher (2019). Sydney, AU.

  • Muvee Technologies. Software Engineering Intern (2018). Singapore, SG.

  • French Armed Forces. Officer Cadet (2016 – 2017). Tahiti, PF.

My 2-page resume (PDF) and my LinkedIn profile have more details.

Research

Infrastructure systems for differential privacy

At Columbia, I am designing and implementing infrastructure systems for differential privacy.

  • We implemented PrivateKube (published at OSDI ’21), a system that manages privacy budget as a resource (akin to CPU or RAM) on Kubernetes clusters; a toy sketch of the idea follows this list.
  • We also explored scheduling algorithms, such as DPack, that make more efficient use of this limited resource.
  • More recently, I’ve been working on new designs for private database systems. We presented Turbo, a caching layer for statistical workloads, at SOSP ’23.
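
To make the idea of privacy budget as a resource concrete, here is a toy, self-contained sketch in Python (hypothetical names, not PrivateKube’s actual API): each block of data carries a finite differential-privacy budget ε, tasks consume it irreversibly, and a task is admitted only while enough budget remains.

    # Toy illustration of privacy budget as a consumable resource
    # (hypothetical names, not PrivateKube's actual API).
    from dataclasses import dataclass


    @dataclass
    class DataBlock:
        """A partition of data with a total differential-privacy budget epsilon."""
        epsilon_total: float
        epsilon_used: float = 0.0

        @property
        def epsilon_remaining(self) -> float:
            return self.epsilon_total - self.epsilon_used

        def try_allocate(self, epsilon_requested: float) -> bool:
            """Admit a task only if enough budget remains; consumption is irreversible."""
            if epsilon_requested <= self.epsilon_remaining:
                self.epsilon_used += epsilon_requested
                return True
            return False


    block = DataBlock(epsilon_total=1.0)
    print(block.try_allocate(0.3))  # True: 0.7 remains
    print(block.try_allocate(0.8))  # False: the request exceeds the remaining budget

Unlike CPU or RAM, consumed privacy budget is never returned to the pool, which is what makes scheduling it an interesting problem.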

Check out my lab’s webpage to learn more about these projects.

Past projects

  • Large-scale private machine learning. I collaborated with a team at Brookhaven National Laboratory that designs and evaluates large-scale privacy-preserving machine learning systems running on top of the U.S. Department of Energy’s supercomputers. We improved support for distributed training in PyTorch’s differential privacy library, Opacus (a minimal usage sketch follows this list). I worked on similar problems during my internship at Microsoft Research’s Privacy in AI team, where I added differential privacy to mixture-of-experts transformers, a powerful type of large language model.

  • Function secret sharing for encrypted deep learning. I maintain an open-source library for OpenMined: Sycret, the first Python library for function secret sharing. It relies on an efficient, parallelized Rust backend that uses cryptographic hardware acceleration. We plugged it into PySyft, one of the leading privacy-preserving machine learning frameworks, to build AriaNN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing (published at PETS ’22).

  • Distributed algorithms. I spent five months in 2019 at the Concurrent Systems Research Group of the University of Sydney, where I studied cross-chain protocols and the formal verification of distributed algorithms with Vincent Gramoli and Rob van Glabbeek. In 2020, we worked on a compositional approach to model-checking consensus algorithms. Our research appeared at SPAA ’20, PODC ’22, and DISC ’22.
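
For the Opacus work mentioned in the first item above, here is a minimal sketch of DP-SGD training with Opacus’ PrivacyEngine, on a toy model and toy data; it illustrates what the library provides, not our distributed-training contributions.

    # Minimal DP-SGD sketch with Opacus (toy model and data).
    import torch
    from opacus import PrivacyEngine

    model = torch.nn.Linear(10, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    dataset = torch.utils.data.TensorDataset(
        torch.randn(64, 10), torch.randint(0, 2, (64,))
    )
    data_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

    privacy_engine = PrivacyEngine()
    model, optimizer, data_loader = privacy_engine.make_private(
        module=model,
        optimizer=optimizer,
        data_loader=data_loader,
        noise_multiplier=1.0,  # noise scale relative to the clipping bound
        max_grad_norm=1.0,     # per-sample gradient clipping bound
    )

    criterion = torch.nn.CrossEntropyLoss()
    for x, y in data_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()  # gradients are clipped per sample and noised before the update

Under the hood, Opacus clips each example’s gradient and adds Gaussian noise before the optimizer step, which is what makes the resulting model differentially private.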

Publications

See my Google Scholar profile for a more exhaustive list.