
Load the Model #2

Open
mady143 opened this issue Jul 13, 2020 · 2 comments


mady143 commented Jul 13, 2020

Hi @umair13adil ,

I would like some advice about my app. I built a Flutter app and bundled an ML model with it locally, and it works fine. However, when I take a build it comes out at more than 40 MB, largely because the model itself is about 42 MB, which is too big. Instead of bundling it, I want to load the model from a server, but I don't know how to implement that functionality. Could you help me with this?

Thanks & Regards,
Manikantha Sekhar.


NeighborhoodCoding commented Sep 7, 2020

I'm a beginner, so I don't know this well, but maybe this article on deploying ML models with Flask as a REST API and accessing them from a Flutter app will help: https://medium.com/analytics-vidhya/deploy-ml-models-using-flask-as-rest-api-and-access-via-flutter-app-7ce63d5c1f3b
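
For illustration, a minimal sketch of that approach: a Flask server loads the TFLite model once and exposes a prediction endpoint, so the Flutter app only sends input data over HTTP instead of shipping the 42 MB model in the APK. The endpoint name, model path, and input shape below are placeholders and would need to match your actual model.

```python
# Minimal sketch: serve a TFLite model behind a Flask REST endpoint.
# Assumptions: a local file "model.tflite", a single float32 input tensor,
# and JSON requests shaped like {"input": [[...]]}.
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the interpreter once at startup, not per request.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

@app.route("/predict", methods=["POST"])
def predict():
    data = request.get_json(force=True)
    x = np.array(data["input"], dtype=np.float32)
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()
    y = interpreter.get_tensor(output_details[0]["index"])
    return jsonify({"prediction": y.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The Flutter app would then POST its input as JSON to the /predict endpoint (for example with the http package) and decode the response, so no model file has to be bundled with the app.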

@umair13adil
Owner

The solution provided by Firebase itself is best suited for TFLite models.
See the Firebase guide "Deploy and manage custom models".
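
As a rough sketch of that route, publishing a TFLite model to Firebase ML so the app can download it at runtime could look like the following, assuming the firebase-admin Python SDK's ml module; the service-account file, bucket, model name, and file path are placeholders, and the exact API may vary by SDK version.

```python
# Sketch: upload and publish a TFLite model as a Firebase ML custom model.
# Assumptions: firebase-admin with the ml module, a service-account key,
# and a default Cloud Storage bucket; all names below are placeholders.
import firebase_admin
from firebase_admin import credentials, ml

cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {"storageBucket": "your-project.appspot.com"})

# Upload the local .tflite file to Cloud Storage and register it as a model.
source = ml.TFLiteGCSModelSource.from_tflite_model_file("model.tflite")
model = ml.Model(
    display_name="my_model",  # the name the Flutter app will request
    model_format=ml.TFLiteFormat(model_source=source),
)
created = ml.create_model(model)
ml.publish_model(created.model_id)
```

The Flutter app can then fetch "my_model" with the Firebase ML model downloader plugin on first launch instead of bundling the file in the build, which is what keeps the APK small.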
