
Comment Toxicity classifier #17

Open
Shreesh90 opened this issue Oct 28, 2020 · 4 comments
@Shreesh90
Contributor

Feature Request

Describe your problem

Train a model to identify the different types of toxicity present in a comment. Create a separate folder in the repository and add all the related files there.
Dataset - https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/data
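One possible starting point (not an agreed-upon approach for this issue): the Jigsaw dataset's `train.csv` has a `comment_text` column plus six binary label columns, so the task is multi-label classification. A minimal sketch using TF-IDF features with one-vs-rest logistic regression, shown here on a tiny hypothetical corpus standing in for the real CSV:

```python
# Multi-label toxicity classifier sketch: TF-IDF + one-vs-rest
# logistic regression. The comments/labels below are made-up stand-ins
# for the Jigsaw train.csv ("comment_text" + six 0/1 label columns).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline

# The six label columns from the Jigsaw dataset.
LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

# Tiny illustrative corpus (hypothetical examples, not real data).
comments = [
    "you are an idiot",
    "have a wonderful day",
    "i will hurt you",
    "thanks for the helpful answer",
    "you idiot, i will hurt you",
    "great work on the project",
]
# One row of 0/1 flags per comment, one column per label.
y = np.array([
    [1, 0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0, 0],
])

# One binary classifier per label, sharing the same TF-IDF features.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(comments, y)

# predict() returns a (n_samples, 6) indicator matrix of 0/1 flags.
pred = model.predict(["you are an idiot"])[0]
print(dict(zip(LABELS, pred)))
```

For the real dataset, the corpus and label matrix would instead come from `pandas.read_csv("train.csv")`; a stronger submission would likely swap the linear model for a recurrent or transformer model, but the pipeline shape stays the same.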

@welcome

welcome bot commented Oct 28, 2020

Hello there! 👋 Elevate-Lab heartily 😃 welcomes you to the project! 💖

Thank you, and congrats 🎉 for opening your very first issue in this project. Hope you have a great time here! 😄
You may submit a PR if you like! If you want to report a bug 🐞, please follow our Issue Template. Also make sure to include steps to reproduce it, and be patient while we get back to you. 😄

@Shreesh90
Contributor Author

Assign this issue to me. I will work on this.

@uglyprincess
Collaborator

> Assign this issue to me. I will work on this.

Sure @Shreesh90 ! 😊

@create-issue-branch

Branch issue-17-Comment_Toxicity_classifier created!
