Hi,
Niels here from the open-source team at Hugging Face. Congrats on your work! I found it based on the paper page: https://huggingface.co/papers/2405.18425, which already has a linked model repository.
However, I've got some suggestions regarding how to improve the integration with HF.
1. Make download stats work
Currently, download stats aren't tracked for your models. The easiest way to fix that is by leveraging the `PyTorchModelHubMixin` class, which adds `push_to_hub` and `from_pretrained` capabilities to any custom `nn.Module`. It creates a `config.json` along with `safetensors` weights for each model, which is what makes download counting work.
Alternatively, a PR could be opened to the huggingface.js open-source library as explained here.
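For reference, here's a minimal sketch of what that looks like (the `MyBackbone` class, its layers, and the repo id are hypothetical placeholders, not your actual architecture):

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


class MyBackbone(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_dim: int = 256, num_classes: int = 1000):
        super().__init__()
        self.encoder = nn.Linear(3 * 224 * 224, hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        return self.head(self.encoder(x.flatten(1)))


model = MyBackbone(hidden_dim=256, num_classes=1000)

# Pushes safetensors weights + a config.json (built from the init kwargs) to the Hub
model.push_to_hub("your-username/your-model")

# Anyone can then reload it; the init kwargs are restored from config.json,
# and each download is counted in the repo's stats
reloaded = MyBackbone.from_pretrained("your-username/your-model")
```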
2. Make the model Transformers compatible
In case you want your models to be usable through the Transformers library with `trust_remote_code=True`, I highly recommend following this guide: https://huggingface.co/docs/transformers/custom_models. It allows people to load your backbones through the `AutoModel` and `AutoModelForImageClassification` APIs. We recently did the same with the MambaVision author, as can be seen here: https://huggingface.co/collections/nvidia/mambavision-66943871a6b36c9e78b327d3.
Let me know if you need any help regarding this!
Cheers,
Niels
ML Engineer @ HF 🤗