Inference API returns 404 for all models

I am unable to access any models via the Inference API. All requests return a 404 Not Found error.

  • I have tried multiple models (e.g. distilgpt2 and distilbert-base-uncased-finetuned-sst-2-english).

  • I have generated several new “Read” access tokens, and they all fail.

  • My curl -v logs show a successful connection, but the server always responds with 404.

  • My username is suvadip007.

Can you please check if my account has any restrictions preventing me from using the Inference API? Thank you.

The Inference API has been revamped: its usage, its endpoint URL, and the set of deployed models have all changed significantly. Code that still targets the old endpoint will return a 404 even with a valid token and a valid model ID, so this is most likely not an account restriction.
I believe distilbert-base-uncased-finetuned-sst-2-english is still deployed.
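As a minimal sketch of the URL change, something like the following should work — assuming the current router-based endpoint path and that the model is served by the `hf-inference` provider (check the model page and the Inference Providers docs for your exact URL; `$HF_TOKEN` stands for one of your “Read” tokens):

```shell
# Old-style URL, which now commonly returns 404:
#   https://api-inference.huggingface.co/models/<model_id>
#
# New-style router URL (assumption: model is available via the
# "hf-inference" provider; adjust the path if the docs say otherwise):
curl -s \
  "https://router.huggingface.co/hf-inference/models/distilbert-base-uncased-finetuned-sst-2-english" \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "I love this library!"}'
```

If the new URL also 404s, the model may simply no longer be deployed for serverless inference; trying another provider listed on the model page, or deploying a dedicated Inference Endpoint, are the usual fallbacks.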