Duplicated from burtenshaw/code_quiz
Reported on https://discuss.huggingface.co/t/bug-unit-2-1-quiz-410-client-error-legacy-api-endpoint/173189

## Summary

The Space currently instantiates `InferenceClient(model=..., token=...)`, which can hit the legacy `api-inference.huggingface.co/models/...` path in older/legacy configurations and now fails with HTTP 410 Gone. This PR makes the smallest possible change to force routing via Inference Providers.

## Changes

### app.py

Change one line, from:

`client = InferenceClient(model=HF_API_URL, token=HF_TOKEN)`

to:

`client = InferenceClient(model=HF_API_URL, provider="auto", token=HF_TOKEN)`

### requirements.txt

Bump: `huggingface-hub==0.28.1` -> `huggingface-hub==1.4.1`

## Notes for maintainers

- The Space secret `HF_TOKEN` must be a token that is allowed to call Inference Providers.
- `provider="auto"` uses the account's provider preference order and can fail over if multiple providers support the model.
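For reviewers, the two edits described above amount to the following patch. This is a sketch only: the exact surrounding context of each hunk is assumed, and only the changed lines named in this PR are taken from the Space itself.

```diff
--- a/app.py
+++ b/app.py
-client = InferenceClient(model=HF_API_URL, token=HF_TOKEN)
+client = InferenceClient(model=HF_API_URL, provider="auto", token=HF_TOKEN)

--- a/requirements.txt
+++ b/requirements.txt
-huggingface-hub==0.28.1
+huggingface-hub==1.4.1
```

Passing `provider="auto"` keeps the model and token arguments unchanged while letting the client choose a route through Inference Providers instead of the retired legacy endpoint.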