DistilBERT encoder models trained on the Wikipedia tagging dataset (LF-Wikipedia-500K) using the DEXML (Dual Encoder for eXtreme Multi-Label classification, ICLR'24) method.
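
Below is a minimal usage sketch, assuming the checkpoint is published in a sentence-transformers-compatible format; the model id shown is a placeholder and should be replaced with this repository's actual Hub id.

```python
from sentence_transformers import SentenceTransformer

# Placeholder repo id; substitute the actual Hub id of this model.
model = SentenceTransformer("<this-model-repo-id>")

# Encode a few example texts into dense embeddings for retrieval / tagging.
embeddings = model.encode([
    "Albert Einstein was a theoretical physicist.",
    "The Eiffel Tower is located in Paris.",
])
print(embeddings.shape)  # (2, hidden_dim); 768 for a DistilBERT encoder
```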