Efficient Methods for Natural Language Processing
Last updated: 2022-12-04 Sunday
This is a podcast interview with Roy Schwartz. It can be found [here](https://thedataexchange.media/efficient-methods-for-natural-language-processing/) or in your podcast player.
Roy Schwartz is a Professor of Natural Language Processing at The Hebrew University of Jerusalem. We discussed a recent survey paper (co-authored by Roy) that presents a broad overview of existing methods for improving NLP efficiency, viewed through the lens of the traditional NLP pipeline. Our conversation covered the following key areas:
- Data: using fewer training instances, or making better use of those available, can increase efficiency.
- Model design
- Training: pre-training and fine-tuning
- Inference and compression
What do I think about it?
I was really surprised by the data-efficiency techniques. It is possible to increase training efficiency by training on the most difficult samples. This is somewhat related to creating subsamples using a maximum-dissimilarity approach.
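A minimal sketch of the hard-sample idea (the function name and loss values are mine, not from the survey): given per-sample losses from a previous pass over the data, keep only the examples the model currently finds hardest.

```python
import numpy as np

def select_hardest(losses, k):
    """Return indices of the k samples with the highest loss,
    a simple proxy for 'most difficult'. (Illustrative only.)"""
    losses = np.asarray(losses)
    # argsort is ascending, so take the last k indices and reverse them
    return np.argsort(losses)[-k:][::-1]

# Hypothetical per-sample losses from a previous epoch
losses = [0.1, 2.3, 0.5, 1.7, 0.05]
print(select_hardest(losses, 2))  # the two hardest samples
```

In practice the difficulty signal could instead come from model confidence or a curriculum heuristic; the point is simply that a well-chosen subset can stand in for the full training set.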