Transfer Learning for Structured Pruning under Limited Task Data
This paper was accepted at the Efficient Natural Language and Speech Processing (ENLSP-III) Workshop at NeurIPS. Large, pre-trained models are difficult to deploy in resource-constrained applications. Fortunately, task-aware structured pruning methods offer a solution. These approaches reduce model size by dropping structural units like…
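To make the idea concrete, below is a minimal sketch of what dropping a structural unit can look like in practice: whole attention heads are removed from a fused QKV projection, so the weight matrices actually shrink rather than being zeroed element-wise. This is an illustrative assumption, not the paper's algorithm; the layer layout, the `prune_attention_heads` helper, and the choice of which heads to keep are all placeholders.

```python
# A minimal sketch of structured pruning of attention heads (an assumed
# fused-QKV layout; head selection here is a placeholder, not task-aware).
import torch
import torch.nn as nn

def prune_attention_heads(qkv: nn.Linear, out_proj: nn.Linear,
                          num_heads: int, keep: list[int]):
    """Drop whole attention heads (structural units) from a fused QKV projection."""
    d_model = out_proj.in_features
    head_dim = d_model // num_heads
    # Rows/columns belonging to the heads we keep.
    idx = torch.cat([torch.arange(h * head_dim, (h + 1) * head_dim) for h in keep])

    # The fused QKV weight stacks Q, K, V blocks of size d_model each;
    # keep only the selected heads' rows inside each block.
    new_qkv = nn.Linear(qkv.in_features, 3 * len(keep) * head_dim)
    w_blocks, b_blocks = [], []
    for b in range(3):
        w_blocks.append(qkv.weight.data[b * d_model:(b + 1) * d_model][idx])
        b_blocks.append(qkv.bias.data[b * d_model:(b + 1) * d_model][idx])
    new_qkv.weight.data = torch.cat(w_blocks, dim=0)
    new_qkv.bias.data = torch.cat(b_blocks, dim=0)

    # The output projection loses the matching input columns.
    new_out = nn.Linear(len(keep) * head_dim, out_proj.out_features)
    new_out.weight.data = out_proj.weight.data[:, idx]
    new_out.bias.data = out_proj.bias.data.clone()
    return new_qkv, new_out

# Usage: keep 8 of 12 heads (a real task-aware method would score head
# importance on task data; here we simply keep the first 8 as a placeholder).
d_model, num_heads = 768, 12
qkv, out_proj = nn.Linear(d_model, 3 * d_model), nn.Linear(d_model, d_model)
qkv_small, out_small = prune_attention_heads(qkv, out_proj, num_heads, keep=list(range(8)))
print(qkv_small.weight.shape, out_small.weight.shape)  # smaller matrices, fewer parameters
```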