AuraAeternum-20B

This model is a fine-tuned version of openai/gpt-oss-20b, trained on the Together.ai platform.

Model Details

Model Description

  • Developed by: 0xjesus
  • Model type: Causal Language Model
  • Language(s): English
  • License: Apache 2.0
  • Fine-tuned from model: openai/gpt-oss-20b

Training Details

  • Training Platform: Together.ai
  • Job ID: ft-efd136f0-d9da
  • Base Model: openai/gpt-oss-20b

Uses

Direct Use

This model can be used for text generation tasks, including:

  • Creative writing
  • Text completion
  • Conversational AI
  • Content generation

Out-of-Scope Use

Users should avoid using this model for:

  • Generating harmful or misleading content
  • Making critical decisions without human oversight
  • Applications requiring factual accuracy without verification

How to Use

Using Transformers

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load model and tokenizer
model = AutoModelForCausalLM.from_pretrained("0xjesus/AuraAeternum-20B")
tokenizer = AutoTokenizer.from_pretrained("0xjesus/AuraAeternum-20B")

# Generate text
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,  # cap on newly generated tokens (max_length would also count the prompt)
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
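The temperature and top_p arguments above control how the next token is drawn: temperature rescales the logits before softmax, and top_p (nucleus sampling) keeps only the smallest set of tokens whose cumulative probability reaches p, renormalizing over that set. A minimal NumPy sketch of this filtering step (illustrative only, not the transformers implementation):

```python
import numpy as np

def nucleus_filter(logits, temperature=0.7, top_p=0.9):
    """Apply temperature scaling, then top-p (nucleus) filtering.

    Returns renormalized probabilities over the kept tokens;
    all other tokens get probability 0.
    """
    scaled = logits / temperature
    probs = np.exp(scaled - np.max(scaled))  # stable softmax
    probs /= probs.sum()
    # Sort descending; keep the smallest prefix whose mass reaches top_p
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, top_p) + 1  # always keep >= 1 token
    keep = order[:cutoff]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()

p = nucleus_filter(np.array([2.0, 1.0, 0.5, -1.0]))
```

With these example logits, the two lowest-probability tokens fall outside the 0.9 nucleus and are zeroed out, so sampling can only pick from the top two.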

Using Together.ai API

from together import Together

client = Together(api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="0xjesus/AuraAeternum-20B",
    messages=[{"role": "user", "content": "Hello! How are you?"}],
)
print(response.choices[0].message.content)

Limitations and Biases

  • This model inherits the limitations and biases present in the base model (openai/gpt-oss-20b)
  • The model's outputs should be carefully reviewed for accuracy and appropriateness
  • Performance may vary depending on the specific use case

Citation

If you use this model in your research, please cite:

@misc{AuraAeternum_20B_2024,
  author = {0xjesus},
  title = {AuraAeternum-20B},
  year = {2024},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/0xjesus/AuraAeternum-20B}}
}

Contact

For questions or feedback about this model, please open an issue on the model repository.

