jinaai / jina-bert-flash-implementation
Jina AI
Tags: Transformers · bert · custom_code
Region: EU
jina-bert-flash-implementation · 131 kB · 6 contributors · 73 commits
Latest commit 2e2b8d0 by Markus28, almost 2 years ago: "feat: choose flash attention heuristically if not set explicitly"
| File | Size | Last commit | Updated |
| --- | --- | --- | --- |
| bert_padding.py | 9.78 kB | reference the flash attention GitHub | almost 2 years ago |
| block.py | 17.4 kB | reference the flash attention GitHub | almost 2 years ago |
| configuration_bert.py | 5.76 kB | added classifier dropout | almost 2 years ago |
| embedding.py | 6.43 kB | reference the flash attention GitHub | almost 2 years ago |
| mha.py | 35.3 kB | reference the flash attention GitHub | almost 2 years ago |
| mlp.py | 6.17 kB | reference the flash attention GitHub | almost 2 years ago |
| modeling_bert.py | 28.7 kB | feat: choose flash attention heuristically if not set explicitly | almost 2 years ago |
| modeling_for_glue.py | 10.7 kB | feat: assert return_dict | almost 2 years ago |
| modeling_lora.py | 7.37 kB | feat: select first LoRA upon initialization | almost 2 years ago |
| tokenizer.py | 3.65 kB | support-multiple-task-ids (#5) | almost 2 years ago |
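Because the repository is tagged custom_code, the modeling files above are downloaded and executed by transformers when a checkpoint that points to this implementation is loaded with trust_remote_code=True. A minimal sketch of that loading path follows; the model id is an assumption for illustration, not something stated on this page.

```python
from transformers import AutoModel, AutoTokenizer

# Assumption: a checkpoint whose auto_map references this implementation.
# "jinaai/jina-embeddings-v2-base-en" is used here purely as an illustration.
model_id = "jinaai/jina-embeddings-v2-base-en"

# trust_remote_code=True allows transformers to run the custom
# configuration_bert.py / modeling_bert.py shipped with the checkpoint
# instead of the built-in BERT classes.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Flash-attention BERT test sentence.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```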