Weird tokens get printed out in llama.cpp

#26
by Noonecares52647842 - opened

I have tested the model, and it prints "<|start|>assistant<|channel|>final<|message|>" at the start of each response.

Passing `--jinja` (which makes llama.cpp apply the model's chat template) fixed it.
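A minimal sketch of the fix, assuming a hypothetical model path (substitute your own GGUF file):

```shell
# --jinja tells llama.cpp to apply the chat template embedded in the GGUF,
# so markers like <|start|> and <|channel|> are treated as special tokens
# rather than being emitted as literal text in the response.
llama-cli -m ./model.gguf --jinja -p "Hello"
```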

Noonecares52647842 changed discussion status to closed
