A Structured Self-attentive Sentence Embedding


Disclaimer: The code links provided for this paper are external links. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code(s), you agree to comply with the terms of use set out by the author(s) of the code(s).

Please contact us if you find a broken link here

Download
Authors Zhouhan Lin, Minwei Feng, Cicero Nogueira dos Santos, Mo Yu, Bing Xiang, Bowen Zhou, Yoshua Bengio
Journal/Conference Name 5th International Conference on Learning Representations, ICLR 2017 - Conference Track Proceedings
Paper Category
Paper Abstract This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention. Instead of using a vector, we use a 2-D matrix to represent the embedding, with each row of the matrix attending on a different part of the sentence. We also propose a self-attention mechanism and a special regularization term for the model. As a side effect, the embedding comes with an easy way of visualizing what specific parts of the sentence are encoded into the embedding. We evaluate our model on 3 different tasks: author profiling, sentiment classification, and textual entailment. Results show that our model yields a significant performance gain compared to other sentence embedding methods in all of the 3 tasks.
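The mechanism summarized in the abstract can be sketched in a few lines. In the paper, a sentence is first encoded into hidden states H (one row per token); the attention matrix is A = softmax(W_s2 · tanh(W_s1 · Hᵀ)), the matrix embedding is M = A · H, and the regularization penalty is ‖A·Aᵀ − I‖²_F, which pushes different attention rows to focus on different parts of the sentence. The NumPy formulation below and the names W_s1, W_s2 are illustrative, not the authors' released code:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W_s1, W_s2):
    """Compute the 2-D sentence embedding M and attention matrix A.

    H    : (n, 2u) hidden states for an n-token sentence
    W_s1 : (d_a, 2u) first attention projection
    W_s2 : (r, d_a)  second projection; r = number of attention hops
    Returns M of shape (r, 2u) and A of shape (r, n).
    """
    # Softmax over the token dimension so each row of A is a
    # distribution over sentence positions.
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=1)
    M = A @ H
    return M, A

def attention_penalty(A):
    """Squared Frobenius norm of (A A^T - I): encourages the r
    attention rows to attend to different parts of the sentence."""
    r = A.shape[0]
    return np.sum((A @ A.T - np.eye(r)) ** 2)

# Toy usage with random weights (a trained model would learn W_s1, W_s2).
rng = np.random.default_rng(0)
H = rng.standard_normal((10, 6))      # 10 tokens, 2u = 6
W_s1 = rng.standard_normal((4, 6))    # d_a = 4
W_s2 = rng.standard_normal((3, 4))    # r = 3 attention hops
M, A = structured_self_attention(H, W_s1, W_s2)
```

In training, the penalty term is added to the task loss with a small coefficient; at inference, the rows of A can be visualized as per-hop heatmaps over the sentence, which is the interpretability the abstract refers to.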
Date of publication 2017
Code Programming Language Multiple
Comment

Copyright Researcher 2022