TeamLab/teamlab-nlp-seminar-2019

Deep NLP seminar 2019

Overview

This repository archives the presentation materials for TEAMLAB's Deep NLP seminar. The goal of the seminar is to present and implement papers that solve a variety of NLP tasks using deep learning.

Presentation and code guidelines

  • Every presentation must include an implementation.
  • Experiments must be run on a Korean dataset in addition to a representative English dataset.
  • Experiments must cover at least three datasets and include preprocessing code for each dataset type.
  • The final deliverable must be written as .py files, with automated code that saves experiment results in CSV format.
  • Code must be written in PyTorch, following the conventions of the official PyTorch example code.
  • Write from scratch where possible; external code such as pytorch-transformer, spaCy, and fast-ai may be used in addition where helpful.
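The "automated CSV results" requirement above can be sketched minimally as follows. This is only an illustration of the expected output shape, not the seminar's actual harness: the dataset names and accuracy values are placeholders, and in a real run each row would come from a PyTorch training/evaluation loop.

```python
import csv

def save_results_csv(rows, path):
    """Write a list of per-experiment result dicts to a CSV file,
    using the dict keys of the first row as the header."""
    fieldnames = list(rows[0].keys())
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical results over three datasets (one English, two Korean);
# names and numbers are placeholders for what a training loop would produce.
results = [
    {"dataset": "sst2-en", "task": "classification", "accuracy": 0.91},
    {"dataset": "nsmc-ko", "task": "classification", "accuracy": 0.88},
    {"dataset": "kornli-ko", "task": "nli", "accuracy": 0.79},
]

save_results_csv(results, "results.csv")
```

Keeping the writer generic over dict keys means the same helper works for every task's metric set, which is what makes the per-experiment export automatable.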

Schedule and topics

NLP trends

http://ruder.io/state-of-transfer-learning-in-nlp/

| Date  | Paper | Presenter | ppt | code | Category |
|-------|-------|-----------|-----|------|----------|
| 09/18 | Sequence to Sequence Learning with Neural Networks | Sion Jang | ppt | code | MT |
| 09/18 | Neural Machine Translation by Jointly Learning to Align and Translate | Jeyoung Kim | ppt | | MT |
| 09/25 | Attention Is All You Need | Hyeonju Lee | | | LM |
| 09/25 | Convolutional Sequence to Sequence Learning | Seokkyu Choi | ppt | code | MT |
| 10/02 | Multimodal Machine Translation with Embedding Prediction | Sion Jang | ppt | | MT |
| 10/02 | Deep contextualized word representations | Jeyoung Kim | | | LM |
| 10/16 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Hyeonju Lee | | | LM |
| 10/16 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | Seokkyu Choi | | | LM |
| 10/23 | OpenKE: An Open Toolkit for Knowledge Embedding | Sion Jang | ppt | | QA |
| 10/23 | Reading Wikipedia to Answer Open-Domain Questions | Jeyoung Kim | | | QA |
| 10/30 | Bidirectional Attention Flow for Machine Comprehension | Hyeonju Lee | | | QA |
| 10/30 | Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases | Seokkyu Choi | | | QA |
| 11/06 | RMDL: Random Multimodel Deep Learning for Classification | Sion Jang | | | Classification |
| 11/06 | MaskGAN: Better Text Generation via Filling in the___ | Jeyoung Kim | | | Generation |
| 11/13 | Long Text Generation via Adversarial Training with Leaked Information | Hyeonju Lee | | | Generation |
| 11/13 | Deep Graph Convolutional Encoders for Structured Data to Text Generation | Seokkyu Choi | | | Generation |
| 11/20 | GPT-2: Language Models are Unsupervised Multitask Learners | Sion Jang | | | LM + Generation |
| 11/20 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | Jeyoung Kim | | | LM + Generation |
| 11/27 | Semi-Supervised Sequence Modeling with Cross-View Training | Hyeonju Lee | | | NER |
| 11/27 | SciBERT: Pretrained Contextualized Embeddings for Scientific Text | Seokkyu Choi | | | RE |
| 12/04 | BERT for Coreference Resolution: Baselines and Analysis | Sion Jang | | | CR |
| 12/04 | Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation | Jeyoung Kim | | | WSD |
| 12/11 | CRIM at SemEval-2018 Task 9: A Hybrid Approach to Hypernym Discovery | Hyeonju Lee | | | Hypernym |
