Distilled Gradual Pruning with Pruned Fine-tuning

Fontana F. • Lanzino R. • Marini M. R. • et al. • Foresti G. L.
2024
  • journal article

Journal
IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE
Abstract
Neural Networks (NNs) have driven much of the recent progress in machine learning, but their ever-larger models are difficult to deploy in resource-limited environments. Weight pruning reduces computational demand, but often at the cost of degraded performance and long retraining procedures. This work introduces Distilled Gradual Pruning with Pruned Fine-tuning (DG2PF), a comprehensive algorithm that iteratively prunes pre-trained neural networks using knowledge distillation. We employ a magnitude-based unstructured pruning function that selectively removes a specified proportion of unimportant weights from the network; this pruning also compresses the model efficiently while minimizing the loss of classification accuracy. Additionally, we introduce a simulated pruning strategy that achieves the same effect as weight recovery while maintaining stable convergence. Furthermore, we propose a multi-step self-knowledge-distillation strategy to effectively transfer the knowledge of the full, unpruned network to its pruned counterpart. We validate the performance of our algorithm through extensive experimentation on diverse benchmark datasets, including CIFAR-10 and ImageNet, and across a set of model architectures. The results show that our algorithm prunes and optimizes pre-trained neural networks without substantially degrading their classification accuracy while delivering significantly faster and more compact models.
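The abstract names three components: magnitude-based unstructured pruning, a simulated pruning step that lets removed weights recover, and self-knowledge distillation from the unpruned network to its pruned counterpart. The sketch below is a minimal PyTorch illustration of those ideas under stated assumptions, not the authors' DG2PF implementation: the function names, the linear sparsity ramp, and the distillation hyperparameters (temperature T, mixing weight alpha) are all illustrative choices, and the paper's multi-step self-distillation schedule is not reproduced.

```python
# Minimal sketch of the ideas named in the abstract (an illustration, NOT the
# authors' DG2PF code): magnitude-based unstructured pruning, "simulated"
# pruning that lets zeroed weights recover, and distillation from the
# unpruned teacher during pruned fine-tuning.
import copy
import torch
import torch.nn.functional as F

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask zeroing the `sparsity` fraction of smallest-magnitude weights."""
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

def apply_pruning(model: torch.nn.Module, sparsity: float) -> None:
    """Zero the least important weights in place. The zeroed entries stay
    trainable, so they can grow back ("recover") during fine-tuning before
    the next, sparser mask is computed: a simple stand-in for the paper's
    simulated pruning strategy."""
    for param in model.parameters():
        if param.dim() > 1:  # prune weight tensors; skip biases/norm params
            param.data.mul_(magnitude_mask(param.data, sparsity))

def distillation_loss(student_logits, teacher_logits, targets,
                      T: float = 4.0, alpha: float = 0.5):
    """Cross-entropy on labels blended with KL divergence to the teacher's
    softened outputs (standard knowledge distillation; T and alpha are
    illustrative assumptions)."""
    hard = F.cross_entropy(student_logits, targets)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

def gradual_prune(model, loader, optimizer, final_sparsity=0.9, steps=5):
    """Iteratively increase sparsity (the linear ramp is an assumption) and
    fine-tune the pruned network against its frozen, unpruned copy."""
    teacher = copy.deepcopy(model).eval()
    for step in range(1, steps + 1):
        apply_pruning(model, final_sparsity * step / steps)
        for inputs, targets in loader:  # pruned fine-tuning pass
            with torch.no_grad():
                teacher_logits = teacher(inputs)
            loss = distillation_loss(model(inputs), teacher_logits, targets)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```

Keeping masked weights trainable between mask recomputations is what allows "recovery" in this sketch: a weight zeroed at one step can re-enter the network at the next if its magnitude grows back, which approximates the behavior the abstract's simulated pruning strategy is designed to deliver with stable convergence.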
DOI
10.1109/TAI.2024.3366497
Archive
https://hdl.handle.net/11390/1271966
info:eu-repo/semantics/altIdentifier/scopus/2-s2.0-85185387701
https://ricerca.unityfvg.it/handle/11390/1271966
Rights
metadata only access
Subjects
  • Artificial intelligence
  • Classification algorithms
  • Computational modeling
  • Computer architecture
  • deep learning
  • Knowledge engineering
  • neural network
  • Schedule
  • supervised learning
  • Training
