BERT finetuned on the Banking77 Open Intent Classification Dataset

This is a BERT model finetuned on the PolyAI/banking77 dataset.
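Below is a minimal usage sketch. It assumes the repository ships a sequence-classification head and tokenizer that the transformers pipeline can load directly; the example utterance is illustrative, not from the dataset.

```python
# Hypothetical inference sketch for this model card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="KaiquanMah/BERT-Banking77-OpenIntentClassification",
)

# Classify a banking-related customer utterance into one of the 77 intents.
print(classifier("I still have not received my new card, when will it arrive?"))
```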

Training Configuration

  • PRETRAINED_MODEL_NAME = "bert-base-uncased"
  • BATCH_SIZE = 128
  • LR_PRETRAIN = 2e-5
  • EPOCHS_PRETRAIN = 20
  • DEVICE = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
  • MAX_LEN = 128 — sequences longer than 128 tokens are truncated, and shorter sequences are padded up to 128 tokens (see the training sketch after this list)
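
The sketch below wires these hyperparameters together with the Hugging Face datasets and transformers Trainer API. The exact training script for this model is not shown on this card, so the preprocessing steps and Trainer arguments here are assumptions for illustration only; the constants themselves come from the configuration above.

```python
# Minimal fine-tuning sketch using the configuration listed above.
# The dataset and base model names come from this card; everything else
# (preprocessing, Trainer settings) is an assumed reconstruction.
import torch
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

PRETRAINED_MODEL_NAME = "bert-base-uncased"
BATCH_SIZE = 128
LR_PRETRAIN = 2e-5
EPOCHS_PRETRAIN = 20
MAX_LEN = 128
DEVICE = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Banking77 has "text" and "label" columns with 77 intent classes.
dataset = load_dataset("PolyAI/banking77")
tokenizer = AutoTokenizer.from_pretrained(PRETRAINED_MODEL_NAME)

def tokenize(batch):
    # Truncate long sequences to MAX_LEN and pad short ones up to MAX_LEN.
    return tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=MAX_LEN
    )

dataset = dataset.map(tokenize, batched=True)
dataset = dataset.rename_column("label", "labels")

model = AutoModelForSequenceClassification.from_pretrained(
    PRETRAINED_MODEL_NAME, num_labels=77
).to(DEVICE)

args = TrainingArguments(
    output_dir="bert-banking77",
    per_device_train_batch_size=BATCH_SIZE,
    learning_rate=LR_PRETRAIN,
    num_train_epochs=EPOCHS_PRETRAIN,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
)
trainer.train()
```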
Model Details

  • Model size: ~0.1B parameters
  • Tensor type: F32 (stored in the Safetensors format)