
Code for BERT classifier finetuning for multiclass text classification


laurabbb/bert-finetuning-catalyst

 
 


Instructions:

  • install Poetry and run poetry install to create an environment and install all dependencies (you may need to adapt the PyTorch version in pyproject.toml to match your CUDA version)
  • specify your data, model, and training parameters in config.yml
  • if needed, customize the data-processing code in src/data.py
  • specify your model in src/model.py; by default it is DistilBERT for sequence classification
  • run poetry run python src/train.py
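
The steps above are driven by config.yml. A minimal sketch of what such a file might look like (every key name below is an assumption for illustration only; the actual schema is defined by config.yml and src/train.py in this repo):

```yaml
# Hypothetical config.yml layout -- adapt the key names to the repo's actual schema.
model:
  name: distilbert-base-uncased   # default model per the instructions above
  num_classes: 4                  # number of target classes in your dataset

data:
  train_path: data/train.csv      # assumed CSV with text and label columns
  valid_path: data/valid.csv
  text_field: text
  label_field: label
  max_seq_length: 256             # tokens kept per example after tokenization

training:
  batch_size: 32
  num_epochs: 3
  learning_rate: 3.0e-5
  seed: 42
  log_dir: logs/                  # where Catalyst would write checkpoints and metrics
```

Whatever the real key names are, the model, data, and training sections in config.yml are what you would edit before touching any code in src/.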

Video tutorial

I explain the pipeline in detail in a video tutorial that consists of four parts:

Also, see other tutorials/talks on the topic:
