nerui-seq_bn-rf64-4

This model, apwic/nerui-seq_bn-rf64-4, is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0385
  • Location Precision: 0.9238
  • Location Recall: 0.9417
  • Location F1: 0.9327
  • Location Number: 103
  • Organization Precision: 0.9128
  • Organization Recall: 0.9181
  • Organization F1: 0.9155
  • Organization Number: 171
  • Person Precision: 0.9771
  • Person Recall: 0.9771
  • Person F1: 0.9771
  • Person Number: 131
  • Overall Precision: 0.9363
  • Overall Recall: 0.9432
  • Overall F1: 0.9397
  • Overall Accuracy: 0.9878
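The overall precision, recall, and F1 are micro-averages over the three entity types. As a sanity check, the entity counts can be reconstructed from the reported per-type precision, recall, and support (the triples below are derived from those numbers, not taken from the actual evaluation logs):

```python
# Reconstructed per-type counts: (true positives, predicted entities, gold entities).
# TP = recall * support, predicted = TP / precision — derived from the reported
# metrics above, so these are assumptions, not logged values.
counts = {
    "LOC": (97, 105, 103),
    "ORG": (157, 172, 171),
    "PER": (128, 131, 131),
}

tp = sum(c[0] for c in counts.values())    # 382
pred = sum(c[1] for c in counts.values())  # 408
gold = sum(c[2] for c in counts.values())  # 405

precision = tp / pred
recall = tp / gold
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 4), round(recall, 4), round(f1, 4))
# matches the reported 0.9363 / 0.9432 / 0.9397
```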

Model description

More information needed

Intended uses & limitations

More information needed
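Although usage details are not documented, the metrics above imply a token-classification model with PER, LOC, and ORG labels. Assuming a standard BIO tag scheme (the exact label set is not published here), post-processing predicted tags into entity spans can be sketched as:

```python
def bio_to_spans(tokens, tags):
    """Merge BIO tags into (entity_type, text) spans.

    Assumes tags like "B-PER", "I-LOC", "O"; an I- tag that does not
    continue a matching entity is treated as outside.
    """
    spans = []
    etype, words = None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if etype:
                spans.append((etype, " ".join(words)))
            etype, words = tag[2:], [tok]
        elif tag.startswith("I-") and etype == tag[2:]:
            words.append(tok)
        else:
            if etype:
                spans.append((etype, " ".join(words)))
            etype, words = None, []
    if etype:
        spans.append((etype, " ".join(words)))
    return spans

tokens = ["Joko", "Widodo", "lahir", "di", "Surakarta"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))
# [('PER', 'Joko Widodo'), ('LOC', 'Surakarta')]
```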

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
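The training log shows 96 optimizer steps per epoch, so 100 epochs correspond to 9,600 steps in total. Assuming zero warmup steps (none are listed), the linear scheduler decays the learning rate from 5e-05 to 0 over those steps; a minimal sketch:

```python
TOTAL_STEPS = 96 * 100  # 96 optimizer steps per epoch x 100 epochs = 9600
BASE_LR = 5e-05

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR):
    """Linear decay from base_lr to 0, assuming zero warmup steps."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

print(linear_lr(0))     # base rate at the start of training
print(linear_lr(4800))  # half the base rate midway through
print(linear_lr(9600))  # zero at the final step
```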

Training results

Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy
1.0711 1.0 96 0.6462 0.0 0.0 0.0 103 0.0 0.0 0.0 171 0.0 0.0 0.0 131 0.0 0.0 0.0 0.8373
0.5955 2.0 192 0.4464 0.0 0.0 0.0 103 0.28 0.0409 0.0714 171 0.3158 0.0916 0.1420 131 0.2969 0.0469 0.0810 0.8453
0.4337 3.0 288 0.3274 0.3889 0.1359 0.2014 103 0.3756 0.4503 0.4096 171 0.3383 0.5191 0.4096 131 0.3597 0.3926 0.3754 0.8934
0.3506 4.0 384 0.2830 0.4737 0.3495 0.4022 103 0.4416 0.5965 0.5075 171 0.3971 0.6183 0.4836 131 0.4286 0.5407 0.4782 0.9152
0.3082 5.0 480 0.2468 0.4810 0.3689 0.4176 103 0.5219 0.6959 0.5965 171 0.5114 0.6870 0.5863 131 0.5114 0.6099 0.5563 0.9310
0.2709 6.0 576 0.2129 0.4948 0.4660 0.4800 103 0.5708 0.7544 0.6499 171 0.5906 0.7710 0.6689 131 0.5628 0.6864 0.6185 0.9420
0.2357 7.0 672 0.1773 0.5354 0.5146 0.5248 103 0.6415 0.7953 0.7102 171 0.75 0.8702 0.8057 131 0.6544 0.7481 0.6982 0.9544
0.1989 8.0 768 0.1463 0.68 0.6602 0.6700 103 0.7513 0.8304 0.7889 171 0.8380 0.9084 0.8718 131 0.7633 0.8123 0.7871 0.9658
0.1699 9.0 864 0.1226 0.7143 0.7282 0.7212 103 0.7778 0.8596 0.8167 171 0.8652 0.9313 0.8971 131 0.7908 0.8494 0.8190 0.9696
0.1429 10.0 960 0.1061 0.7838 0.8447 0.8131 103 0.8232 0.8713 0.8466 171 0.9051 0.9466 0.9254 131 0.8392 0.8889 0.8633 0.9743
0.1286 11.0 1056 0.0951 0.8165 0.8641 0.8396 103 0.8034 0.8363 0.8195 171 0.9124 0.9542 0.9328 131 0.8420 0.8815 0.8613 0.9751
0.1195 12.0 1152 0.0885 0.7946 0.8641 0.8279 103 0.8066 0.8538 0.8295 171 0.9130 0.9618 0.9368 131 0.8376 0.8914 0.8636 0.9749
0.1123 13.0 1248 0.0820 0.8182 0.8738 0.8451 103 0.8278 0.8713 0.8490 171 0.9130 0.9618 0.9368 131 0.8528 0.9012 0.8764 0.9771
0.1048 14.0 1344 0.0789 0.8349 0.8835 0.8585 103 0.8266 0.8363 0.8314 171 0.9333 0.9618 0.9474 131 0.8633 0.8889 0.8759 0.9782
0.0992 15.0 1440 0.0732 0.8585 0.8835 0.8708 103 0.8324 0.8713 0.8514 171 0.9197 0.9618 0.9403 131 0.8673 0.9037 0.8851 0.9776
0.0947 16.0 1536 0.0708 0.8505 0.8835 0.8667 103 0.8054 0.8713 0.8371 171 0.9478 0.9695 0.9585 131 0.8615 0.9062 0.8833 0.9779
0.0925 17.0 1632 0.0685 0.9010 0.8835 0.8922 103 0.8362 0.8655 0.8506 171 0.9265 0.9618 0.9438 131 0.8816 0.9012 0.8913 0.9790
0.0882 18.0 1728 0.0656 0.8692 0.9029 0.8857 103 0.8605 0.8655 0.8630 171 0.9478 0.9695 0.9585 131 0.8910 0.9086 0.8998 0.9796
0.0837 19.0 1824 0.0655 0.8692 0.9029 0.8857 103 0.8497 0.8596 0.8547 171 0.9338 0.9695 0.9513 131 0.8822 0.9062 0.8940 0.9787
0.0834 20.0 1920 0.0630 0.8785 0.9126 0.8952 103 0.8427 0.8772 0.8596 171 0.9478 0.9695 0.9585 131 0.8854 0.9160 0.9005 0.9796
0.0783 21.0 2016 0.0605 0.8704 0.9126 0.8910 103 0.8671 0.8772 0.8721 171 0.9478 0.9695 0.9585 131 0.8940 0.9160 0.9049 0.9807
0.0769 22.0 2112 0.0590 0.9048 0.9223 0.9135 103 0.8619 0.9123 0.8864 171 0.9338 0.9695 0.9513 131 0.8957 0.9333 0.9141 0.9812
0.0754 23.0 2208 0.0569 0.9048 0.9223 0.9135 103 0.8693 0.8947 0.8818 171 0.9407 0.9695 0.9549 131 0.9014 0.9259 0.9135 0.9820
0.0742 24.0 2304 0.0588 0.8716 0.9223 0.8962 103 0.8655 0.8655 0.8655 171 0.9338 0.9695 0.9513 131 0.8894 0.9136 0.9013 0.9798
0.0726 25.0 2400 0.0568 0.9048 0.9223 0.9135 103 0.8728 0.8830 0.8779 171 0.9338 0.9695 0.9513 131 0.9010 0.9210 0.9109 0.9809
0.0693 26.0 2496 0.0547 0.9048 0.9223 0.9135 103 0.8882 0.8830 0.8856 171 0.9270 0.9695 0.9478 131 0.9053 0.9210 0.9131 0.9826
0.07 27.0 2592 0.0528 0.9126 0.9126 0.9126 103 0.8619 0.9123 0.8864 171 0.9407 0.9695 0.9549 131 0.8998 0.9309 0.9150 0.9832
0.069 28.0 2688 0.0520 0.9135 0.9223 0.9179 103 0.8771 0.9181 0.8971 171 0.9407 0.9695 0.9549 131 0.9067 0.9358 0.9210 0.9840
0.0652 29.0 2784 0.0530 0.9223 0.9223 0.9223 103 0.8588 0.8889 0.8736 171 0.9478 0.9695 0.9585 131 0.9034 0.9235 0.9133 0.9834
0.0638 30.0 2880 0.0522 0.9223 0.9223 0.9223 103 0.875 0.9006 0.8876 171 0.9338 0.9695 0.9513 131 0.9060 0.9284 0.9171 0.9837
0.0642 31.0 2976 0.0512 0.9223 0.9223 0.9223 103 0.8693 0.8947 0.8818 171 0.9407 0.9695 0.9549 131 0.9058 0.9259 0.9158 0.9832
0.0613 32.0 3072 0.0493 0.9143 0.9320 0.9231 103 0.9023 0.9181 0.9101 171 0.9407 0.9695 0.9549 131 0.9179 0.9383 0.9280 0.9848
0.0592 33.0 3168 0.0479 0.9135 0.9223 0.9179 103 0.9070 0.9123 0.9096 171 0.9478 0.9695 0.9585 131 0.9220 0.9333 0.9276 0.9856
0.0578 34.0 3264 0.0478 0.9216 0.9126 0.9171 103 0.9017 0.9123 0.9070 171 0.9407 0.9695 0.9549 131 0.9195 0.9309 0.9252 0.9845
0.0586 35.0 3360 0.0482 0.9126 0.9126 0.9126 103 0.8902 0.9006 0.8953 171 0.9407 0.9695 0.9549 131 0.9124 0.9259 0.9191 0.9837
0.0588 36.0 3456 0.0460 0.9126 0.9126 0.9126 103 0.9064 0.9064 0.9064 171 0.9407 0.9695 0.9549 131 0.9193 0.9284 0.9238 0.9848
0.0568 37.0 3552 0.0462 0.92 0.8932 0.9064 103 0.8820 0.9181 0.8997 171 0.9478 0.9695 0.9585 131 0.9126 0.9284 0.9204 0.9845
0.0554 38.0 3648 0.0465 0.9057 0.9320 0.9187 103 0.8947 0.8947 0.8947 171 0.9478 0.9695 0.9585 131 0.9148 0.9284 0.9216 0.9851
0.0529 39.0 3744 0.0439 0.9057 0.9320 0.9187 103 0.9017 0.9123 0.9070 171 0.9478 0.9695 0.9585 131 0.9177 0.9358 0.9267 0.9856
0.0528 40.0 3840 0.0452 0.9126 0.9126 0.9126 103 0.9118 0.9064 0.9091 171 0.9478 0.9695 0.9585 131 0.9238 0.9284 0.9261 0.9851
0.0524 41.0 3936 0.0431 0.9216 0.9126 0.9171 103 0.8977 0.9240 0.9107 171 0.9478 0.9695 0.9585 131 0.9199 0.9358 0.9278 0.9859
0.0529 42.0 4032 0.0419 0.9216 0.9126 0.9171 103 0.8827 0.9240 0.9029 171 0.9478 0.9695 0.9585 131 0.9133 0.9358 0.9244 0.9862
0.0513 43.0 4128 0.0432 0.9135 0.9223 0.9179 103 0.8902 0.9006 0.8953 171 0.9478 0.9695 0.9585 131 0.9148 0.9284 0.9216 0.9854
0.0488 44.0 4224 0.0448 0.9048 0.9223 0.9135 103 0.8830 0.8830 0.8830 171 0.9478 0.9695 0.9585 131 0.9098 0.9210 0.9153 0.9843
0.0498 45.0 4320 0.0427 0.9038 0.9126 0.9082 103 0.8895 0.8947 0.8921 171 0.9549 0.9695 0.9621 131 0.9144 0.9235 0.9189 0.9848
0.0487 46.0 4416 0.0428 0.9231 0.9320 0.9275 103 0.9064 0.9064 0.9064 171 0.9478 0.9695 0.9585 131 0.9242 0.9333 0.9287 0.9859
0.0478 47.0 4512 0.0410 0.9216 0.9126 0.9171 103 0.8736 0.9298 0.9008 171 0.9478 0.9695 0.9585 131 0.9091 0.9383 0.9235 0.9865
0.0483 48.0 4608 0.0430 0.9048 0.9223 0.9135 103 0.8779 0.8830 0.8805 171 0.9549 0.9695 0.9621 131 0.9098 0.9210 0.9153 0.9854
0.0449 49.0 4704 0.0408 0.9238 0.9417 0.9327 103 0.9023 0.9181 0.9101 171 0.9478 0.9695 0.9585 131 0.9225 0.9407 0.9315 0.9870
0.0444 50.0 4800 0.0418 0.8952 0.9126 0.9038 103 0.9006 0.9006 0.9006 171 0.9549 0.9695 0.9621 131 0.9169 0.9259 0.9214 0.9859
0.0466 51.0 4896 0.0413 0.9238 0.9417 0.9327 103 0.9112 0.9006 0.9059 171 0.9478 0.9695 0.9585 131 0.9265 0.9333 0.9299 0.9862
0.0444 52.0 4992 0.0413 0.9216 0.9126 0.9171 103 0.8857 0.9064 0.8960 171 0.9621 0.9695 0.9658 131 0.9193 0.9284 0.9238 0.9865
0.044 53.0 5088 0.0415 0.9231 0.9320 0.9275 103 0.9059 0.9006 0.9032 171 0.9478 0.9695 0.9585 131 0.9240 0.9309 0.9274 0.9859
0.0432 54.0 5184 0.0404 0.9307 0.9126 0.9216 103 0.8743 0.9357 0.9040 171 0.9549 0.9695 0.9621 131 0.9137 0.9407 0.9270 0.9865
0.0447 55.0 5280 0.0416 0.9231 0.9320 0.9275 103 0.9006 0.9006 0.9006 171 0.9549 0.9695 0.9621 131 0.9240 0.9309 0.9274 0.9865
0.0437 56.0 5376 0.0413 0.9412 0.9320 0.9366 103 0.9123 0.9123 0.9123 171 0.9549 0.9695 0.9621 131 0.9335 0.9358 0.9346 0.9867
0.0428 57.0 5472 0.0402 0.9320 0.9320 0.9320 103 0.9080 0.9240 0.9159 171 0.9549 0.9695 0.9621 131 0.9293 0.9407 0.9350 0.9873
0.042 58.0 5568 0.0409 0.9320 0.9320 0.9320 103 0.9118 0.9064 0.9091 171 0.9621 0.9695 0.9658 131 0.9333 0.9333 0.9333 0.9873
0.0397 59.0 5664 0.0393 0.9406 0.9223 0.9314 103 0.8920 0.9181 0.9049 171 0.9407 0.9695 0.9549 131 0.9199 0.9358 0.9278 0.9870
0.041 60.0 5760 0.0401 0.9320 0.9320 0.9320 103 0.9231 0.9123 0.9176 171 0.9621 0.9695 0.9658 131 0.9381 0.9358 0.9370 0.9876
0.0424 61.0 5856 0.0409 0.9412 0.9320 0.9366 103 0.9226 0.9064 0.9145 171 0.9549 0.9695 0.9621 131 0.9380 0.9333 0.9356 0.9873
0.0423 62.0 5952 0.0406 0.9151 0.9417 0.9282 103 0.9053 0.8947 0.9000 171 0.9621 0.9695 0.9658 131 0.9263 0.9309 0.9286 0.9873
0.0401 63.0 6048 0.0393 0.9245 0.9515 0.9378 103 0.9235 0.9181 0.9208 171 0.9621 0.9695 0.9658 131 0.9363 0.9432 0.9397 0.9878
0.0382 64.0 6144 0.0397 0.9417 0.9417 0.9417 103 0.9286 0.9123 0.9204 171 0.9621 0.9695 0.9658 131 0.9429 0.9383 0.9406 0.9881
0.0397 65.0 6240 0.0386 0.9412 0.9320 0.9366 103 0.9138 0.9298 0.9217 171 0.9697 0.9771 0.9734 131 0.9387 0.9457 0.9422 0.9881
0.0397 66.0 6336 0.0405 0.9238 0.9417 0.9327 103 0.9281 0.9064 0.9172 171 0.9621 0.9695 0.9658 131 0.9381 0.9358 0.9370 0.9873
0.0387 67.0 6432 0.0412 0.9231 0.9320 0.9275 103 0.9053 0.8947 0.9000 171 0.9621 0.9695 0.9658 131 0.9284 0.9284 0.9284 0.9865
0.0398 68.0 6528 0.0391 0.9320 0.9320 0.9320 103 0.9133 0.9240 0.9186 171 0.9695 0.9695 0.9695 131 0.9361 0.9407 0.9384 0.9876
0.0381 69.0 6624 0.0395 0.9320 0.9320 0.9320 103 0.9181 0.9181 0.9181 171 0.9621 0.9695 0.9658 131 0.9360 0.9383 0.9371 0.9876
0.0359 70.0 6720 0.0392 0.9320 0.9320 0.9320 103 0.9128 0.9181 0.9155 171 0.9695 0.9695 0.9695 131 0.9360 0.9383 0.9371 0.9873
0.0368 71.0 6816 0.0387 0.9406 0.9223 0.9314 103 0.8889 0.9357 0.9117 171 0.9695 0.9695 0.9695 131 0.9272 0.9432 0.9351 0.9867
0.0372 72.0 6912 0.0409 0.9412 0.9320 0.9366 103 0.9123 0.9123 0.9123 171 0.9621 0.9695 0.9658 131 0.9358 0.9358 0.9358 0.9881
0.0362 73.0 7008 0.0389 0.9314 0.9223 0.9268 103 0.8876 0.9240 0.9054 171 0.9695 0.9695 0.9695 131 0.9246 0.9383 0.9314 0.9862
0.0386 74.0 7104 0.0395 0.9314 0.9223 0.9268 103 0.8876 0.9240 0.9054 171 0.9695 0.9695 0.9695 131 0.9246 0.9383 0.9314 0.9862
0.0366 75.0 7200 0.0399 0.9412 0.9320 0.9366 103 0.9235 0.9181 0.9208 171 0.9621 0.9695 0.9658 131 0.9406 0.9383 0.9394 0.9881
0.0363 76.0 7296 0.0393 0.9314 0.9223 0.9268 103 0.9133 0.9240 0.9186 171 0.9697 0.9771 0.9734 131 0.9361 0.9407 0.9384 0.9878
0.0355 77.0 7392 0.0390 0.9406 0.9223 0.9314 103 0.9128 0.9181 0.9155 171 0.9621 0.9695 0.9658 131 0.9358 0.9358 0.9358 0.9878
0.0367 78.0 7488 0.0390 0.9406 0.9223 0.9314 103 0.9075 0.9181 0.9128 171 0.9771 0.9771 0.9771 131 0.9383 0.9383 0.9383 0.9876
0.0342 79.0 7584 0.0393 0.9406 0.9223 0.9314 103 0.9128 0.9181 0.9155 171 0.9621 0.9695 0.9658 131 0.9358 0.9358 0.9358 0.9876
0.0344 80.0 7680 0.0400 0.9314 0.9223 0.9268 103 0.9128 0.9181 0.9155 171 0.9621 0.9695 0.9658 131 0.9335 0.9358 0.9346 0.9873
0.0346 81.0 7776 0.0388 0.9314 0.9223 0.9268 103 0.8977 0.9240 0.9107 171 0.9621 0.9695 0.9658 131 0.9268 0.9383 0.9325 0.9867
0.0342 82.0 7872 0.0391 0.9314 0.9223 0.9268 103 0.8977 0.9240 0.9107 171 0.9621 0.9695 0.9658 131 0.9268 0.9383 0.9325 0.9870
0.0343 83.0 7968 0.0387 0.9406 0.9223 0.9314 103 0.9075 0.9181 0.9128 171 0.9621 0.9695 0.9658 131 0.9335 0.9358 0.9346 0.9873
0.0347 84.0 8064 0.0384 0.9223 0.9223 0.9223 103 0.8920 0.9181 0.9049 171 0.9695 0.9695 0.9695 131 0.9244 0.9358 0.9301 0.9862
0.035 85.0 8160 0.0396 0.9314 0.9223 0.9268 103 0.9123 0.9123 0.9123 171 0.9621 0.9695 0.9658 131 0.9333 0.9333 0.9333 0.9876
0.0315 86.0 8256 0.0385 0.9223 0.9223 0.9223 103 0.9023 0.9181 0.9101 171 0.9771 0.9771 0.9771 131 0.9314 0.9383 0.9348 0.9873
0.0344 87.0 8352 0.0387 0.9406 0.9223 0.9314 103 0.9128 0.9181 0.9155 171 0.9621 0.9695 0.9658 131 0.9358 0.9358 0.9358 0.9876
0.035 88.0 8448 0.0385 0.9412 0.9320 0.9366 103 0.9191 0.9298 0.9244 171 0.9771 0.9771 0.9771 131 0.9433 0.9457 0.9445 0.9887
0.0358 89.0 8544 0.0384 0.9223 0.9223 0.9223 103 0.8971 0.9181 0.9075 171 0.9771 0.9771 0.9771 131 0.9291 0.9383 0.9337 0.9870
0.0345 90.0 8640 0.0384 0.9223 0.9223 0.9223 103 0.8971 0.9181 0.9075 171 0.9771 0.9771 0.9771 131 0.9291 0.9383 0.9337 0.9870
0.0336 91.0 8736 0.0384 0.9238 0.9417 0.9327 103 0.9070 0.9123 0.9096 171 0.9697 0.9771 0.9734 131 0.9315 0.9407 0.9361 0.9876
0.0334 92.0 8832 0.0385 0.9151 0.9417 0.9282 103 0.9176 0.9123 0.9150 171 0.9771 0.9771 0.9771 131 0.9361 0.9407 0.9384 0.9878
0.0328 93.0 8928 0.0383 0.9406 0.9223 0.9314 103 0.9080 0.9240 0.9159 171 0.9771 0.9771 0.9771 131 0.9384 0.9407 0.9396 0.9878
0.0332 94.0 9024 0.0387 0.9327 0.9417 0.9372 103 0.9181 0.9181 0.9181 171 0.9621 0.9695 0.9658 131 0.9361 0.9407 0.9384 0.9881
0.0331 95.0 9120 0.0386 0.9238 0.9417 0.9327 103 0.9181 0.9181 0.9181 171 0.9771 0.9771 0.9771 131 0.9386 0.9432 0.9409 0.9884
0.0322 96.0 9216 0.0387 0.9327 0.9417 0.9372 103 0.9181 0.9181 0.9181 171 0.9697 0.9771 0.9734 131 0.9386 0.9432 0.9409 0.9881
0.0341 97.0 9312 0.0387 0.9238 0.9417 0.9327 103 0.9070 0.9123 0.9096 171 0.9697 0.9771 0.9734 131 0.9315 0.9407 0.9361 0.9876
0.0331 98.0 9408 0.0387 0.9238 0.9417 0.9327 103 0.9070 0.9123 0.9096 171 0.9697 0.9771 0.9734 131 0.9315 0.9407 0.9361 0.9878
0.0319 99.0 9504 0.0386 0.9238 0.9417 0.9327 103 0.9070 0.9123 0.9096 171 0.9697 0.9771 0.9734 131 0.9315 0.9407 0.9361 0.9876
0.0329 100.0 9600 0.0385 0.9238 0.9417 0.9327 103 0.9128 0.9181 0.9155 171 0.9771 0.9771 0.9771 131 0.9363 0.9432 0.9397 0.9878

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1