
AI Paper Research

A survey and summary of AI papers


Natural Language Processing — 2020

2 papers

ACL 2020 · 10,000+ citations

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension


Mike Lewis, Yinhan Liu, Naman Goyal et al. (2020)
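
To make the denoising objective concrete, here is a minimal sketch, assuming the Hugging Face `transformers` library and the public `facebook/bart-base` checkpoint (both assumptions, not part of this entry): a span of the input is replaced with the `<mask>` token (text infilling, one of the paper's noising functions) and the sequence-to-sequence model reconstructs the text.

```python
# Minimal sketch of BART's denoising setup (illustrative, not the paper's
# training code). Assumes the Hugging Face `transformers` library and the
# public `facebook/bart-base` checkpoint.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Corrupt the input: replace a span with <mask> (text infilling).
corrupted = "BART is a <mask> pre-training method for sequence models."
inputs = tokenizer(corrupted, return_tensors="pt")

# The autoregressive decoder generates an uncorrupted sequence.
output_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```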

ICLR 2021 · 4,000+ citations

DeBERTa: Decoding-enhanced BERT with Disentangled Attention


Pengcheng He, Xiaodong Liu, Jianfeng Gao et al. (2020)
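
The "disentangled attention" in the title scores each token pair as the sum of three terms: content-to-content, content-to-position, and position-to-content, with the position terms drawn from relative-position embeddings indexed by a clipped distance δ(i, j). The sketch below illustrates that decomposition with toy dimensions; the variable names and sizes are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of DeBERTa-style disentangled attention (toy dimensions,
# random projections; not the authors' code).
import torch

d, L, k = 64, 8, 4  # head dim, sequence length, max relative distance
Qc, Kc = torch.randn(L, d), torch.randn(L, d)          # content projections
Qr, Kr = torch.randn(2 * k, d), torch.randn(2 * k, d)  # relative-position projections

def delta(i: int, j: int, k: int) -> int:
    """Relative distance delta(i, j), clipped into the bucket range [0, 2k)."""
    return max(0, min(2 * k - 1, i - j + k))

# Each score is a sum of three dot products.
A = torch.zeros(L, L)
for i in range(L):
    for j in range(L):
        A[i, j] = (Qc[i] @ Kc[j]                    # content -> content
                   + Qc[i] @ Kr[delta(i, j, k)]     # content -> position
                   + Kc[j] @ Qr[delta(j, i, k)])    # position -> content

# The paper scales by sqrt(3d) because three terms contribute to each score.
attn = torch.softmax(A / (3 * d) ** 0.5, dim=-1)
```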
