
python - ONNX - How do I convert ONNX float32 model to bfloat16? - Stack Overflow


I have found several documents and tools for converting an ONNX model to float16, but none of them supports converting to bfloat16.

The model was originally trained in TensorFlow and then converted to ONNX. I do not have access to the original TensorFlow code/models.

Do you think I could convert the model back to TensorFlow and then re-export it to ONNX as bfloat16? Would this quantization "dilute" the trained model weights?

Please advise.

I have looked into tools for converting ONNX models, but they only support fp16, not bf16.
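For a sense of how much the weights would be "diluted": bfloat16 keeps float32's 8 exponent bits (so the value range is unchanged) but truncates the mantissa from 23 bits to 7, which bounds the round-to-nearest relative error per weight at 2⁻⁸ (about 0.39%). This can be simulated in NumPy before committing to any conversion path; the helper below is a minimal sketch (the function name is my own, not part of any ONNX tool), exploiting the fact that bfloat16 is just the upper 16 bits of a float32:

```python
import numpy as np

def to_bfloat16(x: np.ndarray) -> np.ndarray:
    """Round float32 values to bfloat16 precision (round-to-nearest-even)
    and return them widened back to float32 for easy comparison."""
    bits = np.ascontiguousarray(x, dtype=np.float32).view(np.uint32)
    # bfloat16 is the top 16 bits of a float32: add the rounding bias
    # (plus the tie-breaking bit) before zeroing the lower 16 bits.
    rounded = bits + 0x7FFF + ((bits >> 16) & 1)
    return (rounded & 0xFFFF0000).astype(np.uint32).view(np.float32)

# Measure the worst-case rounding error on some stand-in "weights".
rng = np.random.default_rng(0)
w = rng.standard_normal(10_000).astype(np.float32)
w_bf16 = to_bfloat16(w)
rel_err = np.abs(w_bf16 - w) / np.abs(w)
print(rel_err.max())  # bounded by 2**-8, i.e. under 0.4% per weight
```

Whether that per-weight error degrades end-to-end accuracy depends on the model, so the usual practice is to run validation data through the converted model and compare outputs against the float32 original.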
