
Integrating python ML models with Flutter client locally - Stack Overflow


I'm facing a challenge at work: I'm required to run a lot of Python ML models in my client app, because of performance and latency issues with running some of the models on the server.

I have no experience with integrating ML models, except for TensorFlow Lite models bundled in the project assets, and the coworker who implemented the Python models tells me that he can't export some of them as TFLite models.
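For context, the usual Python-side workflow for producing those bundled TFLite models is a converter call like the one below. This is a minimal sketch with a hypothetical tiny Keras model standing in for the real ones; enabling `SELECT_TF_OPS` is a common workaround when a model uses ops outside the built-in TFLite set, which is one reason a converter can refuse a model outright.

```python
import tensorflow as tf

# Hypothetical stand-in for one of the real models.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Fallback for models that use TF ops with no TFLite builtin equivalent.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_bytes = converter.convert()

# The resulting bytes are what gets shipped in the Flutter asset bundle.
with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```

If even this fallback fails for a model, that model genuinely can't ship as a TFLite asset, which matches what the coworker reported.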

There's a package (onnxruntime) that loads ONNX models and lets me call into them from my Flutter code; it works somewhat like Dart FFI. I've used Dart FFI before to run C++ functions in my Flutter code, and it works great. But my coworker said he has the same issue here: he can't export every model to ONNX either. That made me wonder whether there's a way to use Python code in my Flutter app the way Dart FFI works. I know it won't work the same, since Python is an interpreted language and I can't build shared objects from it. So my question is: is there a way to use Python code, or a Python ML model, in my client app without tflite or onnxruntime?
