Words or Characters? Fine-grained Gating for Reading Comprehension

Authors Junjie Hu, Ye Yuan, Zhilin Yang, Bhuwan Dhingra, William W. Cohen, Ruslan Salakhutdinov
Journal/Conference Name 5th International Conference on Learning Representations, ICLR 2017 - Conference Track Proceedings
Paper Abstract Previous work combines word-level and character-level representations using concatenation or scalar weighting, which is suboptimal for high-level tasks like reading comprehension. We present a fine-grained gating mechanism to dynamically combine word-level and character-level representations based on properties of the words. We also extend the idea of fine-grained gating to modeling the interaction between questions and paragraphs for reading comprehension. Experiments show that our approach can improve the performance on reading comprehension tasks, achieving new state-of-the-art results on the Children's Book Test dataset. To demonstrate the generality of our gating mechanism, we also show improved results on a social media tag prediction task.
Date of publication 2016
Code Programming Language Python
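To make the mechanism in the abstract concrete, here is a minimal Python sketch of fine-grained gating, written from the abstract's description rather than the authors' released code: a per-dimension gate, computed from a vector of word-level features, interpolates between the character-level and word-level representations. The class name FineGrainedGate, the feature dimensions, and the random inputs are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class FineGrainedGate(nn.Module):
    """Sketch of fine-grained gating: a per-dimension gate computed from
    word features mixes character- and word-level representations."""

    def __init__(self, feature_dim: int, repr_dim: int):
        super().__init__()
        # Maps word features (e.g. frequency or POS/NER indicators; hypothetical
        # here) to one gate value per representation dimension.
        self.gate = nn.Linear(feature_dim, repr_dim)

    def forward(self, word_repr, char_repr, word_features):
        # g lies in (0, 1) for every dimension, i.e. the gating is "fine-grained"
        # rather than a single scalar weight per word.
        g = torch.sigmoid(self.gate(word_features))
        # Per-dimension convex combination of the two representations.
        return g * char_repr + (1.0 - g) * word_repr


if __name__ == "__main__":
    batch, feature_dim, repr_dim = 4, 20, 100
    gate = FineGrainedGate(feature_dim, repr_dim)
    word_repr = torch.randn(batch, repr_dim)       # word-embedding lookup
    char_repr = torch.randn(batch, repr_dim)       # output of a char-level encoder
    word_features = torch.randn(batch, feature_dim)
    combined = gate(word_repr, char_repr, word_features)
    print(combined.shape)  # torch.Size([4, 100])
```

Gating each dimension separately (instead of concatenating or using one scalar weight) is what lets the model lean on word embeddings for frequent words and on the character-level encoder for rare or out-of-vocabulary words, which is the improvement the abstract claims over prior combination schemes.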