Model description
Hello everyone,
I have a use case where I need an absolute (metric) depth estimation model, not just a relative one. I saw that the relative model onnx-community/depth-anything-v2-small is a conversion of depth-anything/Depth-Anything-V2-Small, so I thought I could simply convert the model myself using the tools you provide.
So I tried using your online conversion tool: https://huggingface.co/spaces/onnx-community/convert-to-onnx
But I keep getting an error:
Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type depth-anything to be natively supported in the ONNX export.
Then I tried scripts/convert.py (I created a venv, installed the dependencies, etc.) and still got the same error.
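For reference, the generic fallback I attempted was to wrap the model so it returns a plain tensor and call torch.onnx.export directly. This is just a sketch: the 518x518 input size, the axis names, the opset, and the output filename are my own assumptions, and I have not verified that the resulting graph matches what transformers.js expects:

```python
import torch
from transformers import AutoModelForDepthEstimation


class DepthWrapper(torch.nn.Module):
    """Wrap the HF model so the ONNX exporter sees a single tensor output."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, pixel_values):
        # The model returns a DepthEstimatorOutput; unwrap to a plain tensor
        return self.model(pixel_values=pixel_values).predicted_depth


model = AutoModelForDepthEstimation.from_pretrained(
    "depth-anything/Depth-Anything-V2-Small"
)
wrapper = DepthWrapper(model).eval()

# 518 = 14 * 37: the DINOv2 backbone uses 14x14 patches,
# and 518 is the preprocessor's default size (my assumption)
dummy = torch.randn(1, 3, 518, 518)

torch.onnx.export(
    wrapper,
    dummy,
    "depth_anything_v2_small.onnx",
    input_names=["pixel_values"],
    output_names=["predicted_depth"],
    dynamic_axes={
        "pixel_values": {0: "batch", 2: "height", 3: "width"},
        "predicted_depth": {0: "batch", 1: "height", 2: "width"},
    },
    opset_version=17,
)
```

This exports without the "model type depth-anything" error, but it bypasses Optimum entirely, so the input/output naming may not line up with what the transformers.js pipeline expects.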
I spent the afternoon trying everything: searching through the issues, asking every LLM in the hope of a sensible answer, and finally, to avoid smashing my screen, I resorted to opening an issue here.
I don't understand how you exported onnx-community/depth-anything-v2-small or how to reproduce it. Furthermore, the example in your documentation is not functional (or at least I can't get it to work): python -m scripts.convert --quantize --model_id bert-base-uncased fails with:
ModuleNotFoundError: No module named 'onnxscript'
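That import error suggests the script's dependency list is missing a package; installing it manually (my assumption, not verified on a clean environment) may get past it:

```shell
# onnxscript is required by newer torch.onnx export paths but was not
# installed in my venv; add it and retry the documented example
pip install onnxscript
python -m scripts.convert --quantize --model_id bert-base-uncased
```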
Thanks for all your work for the open-source community!
Prerequisites
- The model is supported in Transformers (i.e., listed here)
- The model can be exported to ONNX with Optimum (i.e., listed here)
Additional information
No response
Your contribution
N/A