ACL 2020 Papers by Topic

Paper Roundup (Part 1)

Posted by MeteorMan on July 13, 2020

In this post, MeteorMan introduces open-source ACL 2020 papers, organized by topic.

Now that ACL 2020 has concluded and the papers are publicly available, I have sorted through them according to my own interests, paying particular attention to papers that release code, and grouped them into the following four topics:

  • Question Answering and Reading Comprehension
  • Question Generation
  • Natural Language Inference
  • Pretrained Language Models and Applications

1. Question Answering (QA) and Machine Reading Comprehension

Harvesting and Refining Question-Answer Pairs for Unsupervised QA

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.600.pdf
  • Code: https://github.com/Neutralzz/RefQA

Probabilistic Assumptions Matter: Improved Models for Distantly-Supervised Document-Level Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.501.pdf
  • Code: https://github.com/hao-cheng/ds_doc_qa

Template-Based Question Generation from Retrieved Sentences for Improved Unsupervised Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.413.pdf
  • Code: https://github.com/awslabs/unsupervised-qa

Contextualized Sparse Representations for Real-Time Open-Domain Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.85.pdf
  • Code: https://github.com/jhyuklee/sparc

Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.603.pdf
  • Code: https://github.com/HongyuGong/RCM-Question-Answering.git

Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.599.pdf
  • Code: https://github.com/DancingSoul/NQ_BERT-DM

Unsupervised Alignment-based Iterative Evidence Retrieval for Multi-hop Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.414.pdf
  • Code: https://github.com/vikas95/AIR-retriever

Improving Multi-hop Question Answering over Knowledge Graphs using Knowledge Base Embeddings

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.412.pdf
  • Code: https://github.com/malllabiisc/EmbedKGQA

2. Question Generation

ClarQ: A large-scale and diverse dataset for Clarification Question Generation

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.651.pdf
  • Code: https://github.com/vaibhav4595/ClarQ

Semantic Graphs for Generating Deep Questions

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.135.pdf
  • Code: https://github.com/WING-NUS/SG-Deep-Question-Generation

Syn-QG: Syntactic and Shallow Semantic Rules for Question Generation

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.69.pdf
  • Code: https://bitbucket.org/kaustubhdhole/syn-qg/

Unsupervised FAQ Retrieval with Question Generation and BERT

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.74.pdf
  • Code: https://github.com/YosiMass/faq-retrieval

3. Natural Language Inference (Textual Entailment / Text Matching)

Towards Robustifying NLI Models Against Lexical Dataset Biases

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.773.pdf
  • Code: https://github.com/owenzx/LexicalDebias-ACL2020

NILE: Natural Language Inference with Faithful Natural Language Explanations

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.771.pdf
  • Code: https://github.com/SawanKumar28/nile

Extracting Headless MWEs from Dependency Parse Trees: Parsing, Tagging, and Joint Modeling Approaches

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.775.pdf
  • Code: https://github.com/tzshi/flat-mwe-parsing

Uncertain Natural Language Inference

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.774.pdf
  • Code: https://github.com/ctongfei/unli

4. Pretrained Language Models and Selected Applications

QuASE: Question-Answer Driven Sentence Encoding

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.772.pdf
  • Code: https://github.com/CogComp/QuASE

TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.745.pdf
  • Code: https://github.com/facebookresearch/TaBERT

Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.740.pdf
  • Code: https://github.com/allenai/dont-stop-pretraining

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.703.pdf
  • Code: https://github.com/pytorch/fairseq

Toward Better Storylines with Sentence-Level Language Models

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.666.pdf
  • Code: https://github.com/google-research/google-research/tree/master/better_storylines

tBERT: Topic Models and BERT Joining Forces for Semantic Similarity Detection

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.630.pdf
  • Code: https://github.com/wuningxi/tBERT

FastBERT: a Self-distilling BERT with Adaptive Inference Time

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.537.pdf
  • Code: https://github.com/autoliuweijie/FastBERT

Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.439.pdf
  • Code: https://github.com/google-research/language/tree/master/language/conpono

DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.411.pdf
  • Code: https://github.com/StonyBrookNLP/deformer

Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.315.pdf
  • Code: https://github.com/lsvih/MWA

Span Selection Pre-training for Question Answering

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.247.pdf
  • Code: https://github.com/IBM/span-selection-pretraining

DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.204.pdf
  • Code: https://github.com/castorini/DeeBERT

MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.195.pdf
  • Code: https://github.com/google-research/google-research/tree/master/mobilebert

Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.76.pdf
  • Code: https://github.com/joongbo/tta

Few-Shot NLG with Pre-Trained Language Model

  • Paper: https://www.aclweb.org/anthology/2020.acl-main.18.pdf
  • Code: https://github.com/czyssrs/Few-Shot-NLG