Solve this Issue Please

#37
by deep7654 - opened

import os
from langchain.prompts import PromptTemplate
from langchain_huggingface import HuggingFaceEndpoint

# set your HF token

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_xxxxxx"

# define the LLM

llm = HuggingFaceEndpoint(
    repo_id="deepseek-ai/DeepSeek-V3.1",
    task="conversational",
    max_new_tokens=200,
    temperature=0.7,
)

# prompt template

prompt = PromptTemplate(
    input_variables=["item"],
    template="Suggest 3 creative company names for a business that sells {item}.",
)

chain = prompt | llm

response = chain.invoke({"item": "colorful cloth"})

print(response)

Error:
ValueError Traceback (most recent call last)
/tmp/ipython-input-1999786440.py in <cell line: 0>()
22 chain = prompt | llm
23
---> 24 response = chain.invoke({"item": "colorful cloth"})
25
26 print(response)

8 frames
/usr/local/lib/python3.12/dist-packages/huggingface_hub/inference/_providers/__init__.py in get_provider_helper(provider, task, model)
205
206 if task not in provider_tasks:
--> 207 raise ValueError(
208 f"Task '{task}' not supported for provider '{provider}'. Available tasks: {list(provider_tasks.keys())}"
209 )

ValueError: Task 'text-generation' not supported for provider 'fireworks-ai'. Available tasks: ['conversational']

Wrap your HuggingFaceEndpoint model with the ChatHuggingFace wrapper; you can find ChatHuggingFace in the LangChain documentation. A sketch of the change is below.
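A minimal sketch of that fix, reusing the repo_id, prompt, and sampling settings from the code above (the variable name chat_model is just a placeholder). The assumption here is that ChatHuggingFace issues the chat/conversational request itself, so the explicit task argument on the endpoint is dropped:

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# same endpoint as in the question, without the explicit task argument
llm = HuggingFaceEndpoint(
    repo_id="deepseek-ai/DeepSeek-V3.1",
    max_new_tokens=200,
    temperature=0.7,
)

# wrap the endpoint so LangChain talks to the chat (conversational) API
chat_model = ChatHuggingFace(llm=llm)

chain = prompt | chat_model
response = chain.invoke({"item": "colorful cloth"})

# a chat model returns an AIMessage, so read its .content
print(response.content)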
