
Toxic Speech Detection

This project aims to solve the Toxic Comment Classification Challenge on Kaggle.

Setup

To work on this project, you must first download the competition data from Kaggle.
Download the data from the competition page, place it in the /data/kaggle/ directory, and unzip it.
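
If you prefer to script the unzip step, here is a minimal Python sketch. The archive filename is hypothetical; adjust it to match whatever Kaggle actually saves for you.

```python
import zipfile
from pathlib import Path

# Hypothetical archive name; rename to match the file you downloaded from Kaggle.
archive = Path("data/kaggle/jigsaw-toxic-comment-classification-challenge.zip")

# Extract everything next to the archive, i.e. into data/kaggle/.
with zipfile.ZipFile(archive) as zf:
    zf.extractall(archive.parent)
```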

After unzipping, you should have four files to work with:

  • sample_submission.csv
  • test.csv
  • test_labels.csv
  • train.csv
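
Once the files are in place, they can be loaded with pandas. A minimal sketch follows; the label columns listed are the six used by the Kaggle competition, but verify them against your local copy of train.csv.

```python
import pandas as pd

DATA_DIR = "data/kaggle"

# Training comments, each with six binary toxicity labels.
train = pd.read_csv(f"{DATA_DIR}/train.csv")
# Test comments; the labels live in a separate file.
test = pd.read_csv(f"{DATA_DIR}/test.csv")
# Test labels; rows marked -1 were not used for scoring in the competition.
test_labels = pd.read_csv(f"{DATA_DIR}/test_labels.csv")

label_cols = ["toxic", "severe_toxic", "obscene",
              "threat", "insult", "identity_hate"]
print(train[["comment_text"] + label_cols].head())
```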