Lachlan Cahill (lcahill)
AI & ML interests: None yet
Organizations: None yet
How to use it in PyCharm for auto completion? · 👍 3 · 4 · #6 opened 6 months ago by DrNicefellow
Max output tokens for Llama 3.1 · 8 · #6 opened over 1 year ago by abhirup-sainapse
Training this model · ❤️ 3 · 4 · #33 opened over 1 year ago by ottogutierrez
Why 12b? Who could run that locally? · 😔 👍 11 · 47 · #1 opened over 1 year ago by kaidu88
GPU requirements · 7 · #32 opened over 1 year ago by jmoneydw
Align tokenizer with mistral-common · 3 · #39 opened over 1 year ago by Rocketknight1
feat/tools-in-chat-template · ❤️ 1 · 6 · #21 opened over 1 year ago by lcahill
You are truly a godsend! · #16 opened over 1 year ago by lcahill
thanks and question function-calling. · 10 · #17 opened over 1 year ago by NickyNicky
Not generating [TOOL_CALLS] · 3 · #20 opened over 1 year ago by ShukantP
Examples on usage · 4 · #7 opened over 1 year ago by dgallitelli
Can you add chat template to the `tokenizer_config.json` file? · 5 · #3 opened almost 2 years ago by hdnh2006
add chat template jinja in tokenizer.json · 4 · #4 opened almost 2 years ago by Jaykumaran17
Added Chat Template · 1 · #5 opened almost 2 years ago by lcahill
Adding `safetensors` variant of this model · 👍 1 · 2 · #10 opened almost 2 years ago by SFconvertbot
Adding `safetensors` variant of this model · #94 opened almost 2 years ago by lcahill
Adding `safetensors` variant of this model · ❤️ 4 · 4 · #42 opened about 2 years ago by nth-attempt