PriPrune: Quantifying and Preserving Privacy in Pruned Federated Learning

Share
Files
PriPrune_ACM_TOMPECS-1.pdf (5.137Mb)
Identifiers
URI: https://hdl.handle.net/20.500.12761/1896
DOI: 10.1145/3702241
Metadata
Show full item record
Author(s)
Chu, Tianyue; Yang, Mengwei; Laoutaris, Nikolaos; Markopoulou, Athina
Date
2024-11-02
Abstract
Model pruning has been proposed as a technique for reducing the size and complexity of federated learning (FL) models. By making local models coarser, pruning is intuitively expected to improve protection against privacy attacks. However, the level of this expected privacy protection has not previously been characterized, nor optimized jointly with utility. In this paper, we first characterize the privacy offered by pruning. We establish information-theoretic upper bounds on the information leakage from pruned FL, and we experimentally validate them under state-of-the-art privacy attacks across different FL pruning schemes. Second, we introduce PriPrune, a privacy-aware algorithm for pruning in FL. PriPrune uses defense pruning masks, which can be applied locally after any pruning algorithm, and adapts the defense pruning rate to jointly optimize privacy and accuracy. Another key idea in the design of PriPrune is Pseudo-Pruning: each client applies the defense mask within its local model and sends only the pruned model to the server, while the weights masked out by the defense mask are withheld locally for future local training rather than being removed. We show that PriPrune significantly improves the privacy-accuracy tradeoff compared to state-of-the-art pruned FL schemes. For example, on the FEMNIST dataset, PriPrune improves the privacy of PruneFL by 45.5% without reducing accuracy.
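The Pseudo-Pruning idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `pseudo_prune` and the choice of hiding the smallest-magnitude weights are assumptions for illustration; PriPrune itself learns the defense mask and adapts the defense pruning rate jointly with accuracy. The essential mechanism shown here is that the client zeroes out masked weights only in the copy sent to the server, while its full local weights survive untouched for future local training.

```python
import numpy as np

def pseudo_prune(local_weights: np.ndarray, defense_rate: float):
    """Sketch of Pseudo-Pruning: hide a fraction `defense_rate` of the
    weights from the server, but keep the full tensor locally.

    Here the mask drops the smallest-magnitude weights -- one plausible
    heuristic; the actual algorithm optimizes the mask and rate.
    Returns (shared, mask): `shared` is what goes to the server,
    `mask` marks the weights that were revealed (True) vs hidden (False).
    """
    flat = local_weights.ravel()
    k = int(defense_rate * flat.size)        # number of weights to hide
    mask = np.ones(flat.size, dtype=bool)
    if k > 0:
        # indices of the k smallest-magnitude weights
        hide = np.argsort(np.abs(flat))[:k]
        mask[hide] = False
    # zero the hidden weights only in the copy sent to the server
    shared = np.where(mask, flat, 0.0).reshape(local_weights.shape)
    return shared, mask.reshape(local_weights.shape)

# Usage: the client keeps `w` intact for the next local training round;
# only `shared` leaves the device.
w = np.array([[0.5, -0.01], [0.2, 0.03]])
shared, mask = pseudo_prune(w, defense_rate=0.5)
```

The design point is separation of views: the server sees a coarser (pruned) model, which is what bounds the information leakage, while the client loses nothing, since the withheld weights keep training locally.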