EfficientNetV1-B4-FacesMTL-EXP1

This model is a fine-tuned version of EfficientNetV2-s on the faces-mtl dataset. It achieves the following results on the evaluation set:

  • Gender Accuracy: 0.9006
  • Gender F1: 0.8651
  • Age MAE: 6.7354
  • Age RMSE: 9.1662
  • Loss: 84.2761
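
The card does not state how these metrics were computed; the snippet below is a minimal sketch, assuming gender is a binary classification target and age a scalar regression target (the `evaluate` helper and the binary F1 averaging are illustrative assumptions, not taken from the training code).

```python
# Minimal sketch (not from the model card) of how the reported metrics could be
# computed, assuming binary gender labels and scalar age targets.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def evaluate(gender_true, gender_pred, age_true, age_pred):
    gender_true, gender_pred = np.asarray(gender_true), np.asarray(gender_pred)
    age_true = np.asarray(age_true, dtype=float)
    age_pred = np.asarray(age_pred, dtype=float)
    return {
        "gender_accuracy": accuracy_score(gender_true, gender_pred),
        "gender_f1": f1_score(gender_true, gender_pred),  # binary averaging is an assumption
        "age_mae": float(np.mean(np.abs(age_true - age_pred))),
        "age_rmse": float(np.sqrt(np.mean((age_true - age_pred) ** 2))),
    }
```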

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 5
  • mixed_precision_training: Native AMP
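
These settings map directly onto `transformers.TrainingArguments`; the sketch below reconstructs a configuration consistent with the list above (the output directory and any logging/evaluation cadence are illustrative assumptions, not values from this run).

```python
# Hypothetical reconstruction of the training configuration from the hyperparameters
# listed above; the output directory is an assumption, not stated in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="efficientnet-faces-mtl",  # assumption
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=5,
    fp16=True,  # "Native AMP" mixed-precision training
)
```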

Training results

| Training Loss | Epoch | Step | Gender Accuracy | Gender F1 | Age MAE | Age RMSE | Validation Loss |
|---------------|-------|------|-----------------|-----------|---------|----------|-----------------|
| 221.2506 | 0.1728 | 150 | 0.8704 | 0.7993 | 10.4148 | 13.8506 | 192.2673 |
| 163.1323 | 0.3456 | 300 | 0.8637 | 0.7814 | 9.4884 | 12.8120 | 164.4855 |
| 188.8794 | 0.5184 | 450 | 0.8680 | 0.8299 | 9.0322 | 12.2782 | 151.1256 |
| 147.5274 | 0.6912 | 600 | 0.8741 | 0.8056 | 8.1886 | 10.9624 | 120.4869 |
| 121.8239 | 0.8641 | 750 | 0.8879 | 0.8379 | 7.8131 | 10.4025 | 108.5096 |
| 110.5511 | 1.0369 | 900 | 0.8856 | 0.8386 | 7.7040 | 10.3324 | 107.0565 |
| 116.5555 | 1.2097 | 1050 | 0.8741 | 0.8046 | 7.4726 | 9.9674 | 99.6567 |
| 126.6334 | 1.3825 | 1200 | 0.8902 | 0.8415 | 7.5891 | 10.3014 | 106.3999 |
| 147.6252 | 1.5553 | 1350 | 0.8911 | 0.8509 | 7.3201 | 9.8158 | 96.6354 |
| 124.724 | 1.7281 | 1500 | 0.8951 | 0.8546 | 7.2476 | 9.6878 | 94.1328 |
| 107.5 | 1.9009 | 1650 | 0.8897 | 0.8372 | 7.0946 | 9.4325 | 89.2502 |
| 91.8285 | 2.0737 | 1800 | 0.8980 | 0.8612 | 7.0833 | 9.5193 | 90.9008 |
| 94.1933 | 2.2465 | 1950 | 0.8871 | 0.8302 | 7.0344 | 9.4989 | 90.5074 |
| 98.9504 | 2.4194 | 2100 | 0.8928 | 0.8459 | 6.9311 | 9.3160 | 87.0540 |
| 94.4654 | 2.5922 | 2250 | 0.8977 | 0.8611 | 6.9284 | 9.3570 | 87.8282 |
| 85.7435 | 2.7650 | 2400 | 0.8983 | 0.8634 | 6.8776 | 9.3332 | 87.3804 |
| 125.3979 | 2.9378 | 2550 | 0.8989 | 0.8589 | 6.8158 | 9.2204 | 85.2777 |
| 79.05 | 3.1106 | 2700 | 0.8977 | 0.8555 | 6.8892 | 9.3617 | 87.9025 |
| 81.3652 | 3.2834 | 2850 | 0.8954 | 0.8492 | 6.7664 | 9.1391 | 83.7815 |
| 82.3679 | 3.4562 | 3000 | 0.8989 | 0.8641 | 6.8370 | 9.2874 | 86.5219 |
| 83.2362 | 3.6290 | 3150 | 0.8951 | 0.8620 | 6.7723 | 9.1703 | 84.3717 |
| 80.1852 | 3.8018 | 3300 | 0.8995 | 0.8652 | 6.6909 | 9.0639 | 82.4177 |
| 111.4015 | 3.9747 | 3450 | 0.9012 | 0.8645 | 6.7183 | 9.1129 | 83.3005 |
| 76.6393 | 4.1475 | 3600 | 0.9018 | 0.8635 | 6.7800 | 9.1985 | 84.8666 |
| 80.495 | 4.3203 | 3750 | 0.9023 | 0.8673 | 6.6952 | 9.0644 | 82.4208 |
| 104.2716 | 4.4931 | 3900 | 0.9023 | 0.8655 | 6.7867 | 9.2289 | 85.4275 |
| 77.721 | 4.6659 | 4050 | 0.9020 | 0.8674 | 6.9182 | 9.4068 | 88.7482 |
| 70.1717 | 4.8387 | 4200 | 0.9009 | 0.8621 | 6.7354 | 9.1496 | 83.9694 |
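
The card does not describe how the gender and age objectives are combined into the single reported loss; the sketch below shows one common way to weight a classification term and a regression term together, purely as an illustration (the loss terms and weights are assumptions, not this model's training code).

```python
# Illustrative multi-task loss, NOT the training code for this model:
# combines a cross-entropy term for gender with an L1 (MAE-style) term for age.
import torch.nn as nn

class MultiTaskLoss(nn.Module):
    def __init__(self, gender_weight: float = 1.0, age_weight: float = 1.0):
        super().__init__()
        self.ce = nn.CrossEntropyLoss()
        self.l1 = nn.L1Loss()
        self.gender_weight = gender_weight
        self.age_weight = age_weight

    def forward(self, gender_logits, gender_labels, age_preds, age_labels):
        gender_loss = self.ce(gender_logits, gender_labels)
        age_loss = self.l1(age_preds.squeeze(-1), age_labels.float())
        return self.gender_weight * gender_loss + self.age_weight * age_loss
```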

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.9.0+cu130
  • Datasets 4.4.1
  • Tokenizers 0.22.1