Meta AI’s LegoNN Builds Decoder Modules That Are Reusable Across Diverse Language Tasks Without Fine-Tuning

Encoder-decoder models have become the preferred approach for a wide range of language-related tasks. Although some common logical functions are shared across different tasks, most contemporary encoder-decoder models are trained end-to-end for a single, specific task. This specialization increases the compute…