Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining

Jul 22, 2024

About

The performance of differentially private machine learning can be boosted significantly by leveraging the transfer learning capabilities of non-private models pretrained on large *public* datasets. We critically review this approach. We primarily question whether the use of large Web-scraped datasets *should* be viewed as differential-privacy-preserving. We further scrutinize whether existing machine learning benchmarks are appropriate for measuring the ability of pretrained models to generalize to sensitive domains. Finally, we observe that reliance on large pretrained models may come at the cost of *other* forms of privacy, since it requires sensitive data to be outsourced to a more compute-powerful third party.
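
To make the pipeline the talk critiques concrete, below is a minimal sketch of differentially private learning on top of public pretraining: features come from a non-private pretrained encoder, and only a small linear head is trained on the sensitive data with DP-SGD (per-example gradient clipping plus Gaussian noise). The synthetic stand-in features, hyperparameters, and variable names are illustrative assumptions rather than details from the talk, and privacy accounting is omitted.

```python
# Sketch of DP fine-tuning on top of public pretraining (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for "private" examples already embedded by a pretrained
# public encoder: n examples with d-dimensional features and binary labels.
n, d = 512, 64
features = rng.normal(size=(n, d))
labels = (features[:, 0] > 0).astype(float)

# DP-SGD hyperparameters (illustrative, not from the talk).
clip_norm = 1.0         # per-example gradient clipping bound C
noise_multiplier = 1.1  # noise std relative to C
lr = 0.1
batch_size = 64
steps = 200

w = np.zeros(d)  # linear head trained with DP-SGD

def per_example_grads(w, x, y):
    """Logistic-loss gradients, one row per example."""
    p = 1.0 / (1.0 + np.exp(-x @ w))
    return (p - y)[:, None] * x

for _ in range(steps):
    idx = rng.choice(n, size=batch_size, replace=False)
    grads = per_example_grads(w, features[idx], labels[idx])

    # Clip each example's gradient to norm at most clip_norm.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads *= np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))

    # Sum, add Gaussian noise calibrated to the clipping bound, then average.
    noisy_sum = grads.sum(axis=0) + rng.normal(scale=noise_multiplier * clip_norm, size=d)
    w -= lr * noisy_sum / batch_size

acc = np.mean((features @ w > 0) == labels.astype(bool))
print(f"Training accuracy of the DP-trained head: {acc:.2f}")
```

Only the head ever sees noisy, clipped gradients; the pretraining stage itself is entirely non-private, which is precisely the point of contention in the talk.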
