It is common to fine-tune pre-trained word embeddings for text categorization. However, we find that fine-tuning does not guarantee improvement across text categorization datasets, while it can add a considerable number of parameters to the model. In this paper, we study new transfer methods to address these problems, and propose "Robustness of OOVs" as a perspective for further reducing memory consumption. Experimental results show that the proposed method is a good alternative to fine-tuning on large datasets.
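To illustrate why fine-tuning embeddings inflates the per-task parameter count, here is a minimal sketch, not the paper's implementation, assuming PyTorch as the framework; `vocab_size`, `embed_dim`, and the random `pretrained` matrix are illustrative placeholders for real pre-trained vectors.

```python
# Illustrative sketch only: contrasts a fine-tuned embedding layer with a
# frozen one. Sizes and the random weight matrix are hypothetical.
import torch
import torch.nn as nn

vocab_size, embed_dim = 50_000, 300              # illustrative sizes
pretrained = torch.randn(vocab_size, embed_dim)  # stands in for real pre-trained vectors

# Fine-tuning: the whole embedding matrix becomes trainable, task-specific parameters.
tuned = nn.Embedding.from_pretrained(pretrained, freeze=False)

# Frozen transfer: the matrix is reused as-is and adds no trainable parameters.
frozen = nn.Embedding.from_pretrained(pretrained, freeze=True)

def trainable(m: nn.Module) -> int:
    # Count parameters that the optimizer would have to store and update.
    return sum(p.numel() for p in m.parameters() if p.requires_grad)

print(trainable(tuned))   # 15,000,000 task-specific parameters
print(trainable(frozen))  # 0
```

Under these assumed sizes, fine-tuning stores 15M extra parameters per task, which is the memory cost the abstract's alternative transfer methods aim to avoid.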