jmajkutewicz/zephyr-7b-dpo_hh-rlhf

Tags: Text Generation · PEFT · Safetensors · English · mistral · lora · dpo · alignment · conversational
Repository size: 337 MB · 1 contributor · History: 2 commits
Latest commit by jmajkutewicz: 4651cca (verified, 2 months ago), "Upload folder using huggingface_hub"
Files:
  • .gitattributes · 1.52 kB · initial commit · 2 months ago
  • README.md · 1.38 kB · Upload folder using huggingface_hub · 2 months ago
  • adapter_config.json · 740 Bytes · Upload folder using huggingface_hub · 2 months ago
  • adapter_model.safetensors · 336 MB · Upload folder using huggingface_hub · 2 months ago
  • config.json · 654 Bytes · Upload folder using huggingface_hub · 2 months ago
  • special_tokens_map.json · 551 Bytes · Upload folder using huggingface_hub · 2 months ago
  • tokenizer.json · 1.8 MB · Upload folder using huggingface_hub · 2 months ago
  • tokenizer_config.json · 1.42 kB · Upload folder using huggingface_hub · 2 months ago