Guodong Long
Tao Shen
Latest
Multi-Task Learning for Conversational Question Answering over a Large-Scale Knowledge Base
Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling
DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding
Reinforced Self-Attention Network: A Hybrid of Hard and Soft Attention for Sequence Modeling
Tensorized Self-Attention: Efficiently Modeling Pairwise and Global Dependencies Together