You would need to use MobileNet directly instead of our easy-to-use class wrappers for common models. You can then call model.save() to write the model to local storage or to disk, for example, and load it from there instead of from a URL to something online. To learn how to do this, check my free course, which explains how to load and save models:
Yes, my course covers this. Please see the chapter where I load MobileNet from TF-Hub, chop up its layers, retrain a new model, and then save the resulting layers model. You could save that to localStorage instead using model.save(), so it would work offline: if a copy has already been saved, you load it from localStorage rather than from the network.
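As a rough sketch of that offline pattern, assuming the global `tf` namespace from the @tensorflow/tfjs script tag; the storage key and remote URL below are placeholders, not real resources:

```javascript
// Hypothetical localStorage key and remote model URL.
const STORAGE_URL = 'localstorage://my-retrained-mobilenet';
const REMOTE_URL = 'https://example.com/model.json'; // placeholder

async function loadOrCache() {
  try {
    // A previously saved copy loads with no network at all.
    return await tf.loadLayersModel(STORAGE_URL);
  } catch (e) {
    // First visit: fetch from the network, then cache for offline use.
    const model = await tf.loadLayersModel(REMOTE_URL);
    await model.save(STORAGE_URL);
    return model;
  }
}
```

On every visit after the first, loadOrCache() resolves from localStorage even with no connection.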
Hello. As this is a premade JS class by a team at Google, you would need to hack that code bundle to reference locally cached assets, e.g. the model.json / *.bin files it loads behind the scenes. I would open your favourite code editor and check the Chrome DevTools Network tab while loading a minimal website that uses the class, to see what third-party requests are made. Then grab those resources, find where they are requested in the code, and change those references to point to something you control instead, such as localStorage.
If you are using TensorFlow.js models directly, then you can use the model.save() API to save models to localStorage, IndexedDB, etc.:
Given this is a JS class someone else has made, though, you need to hack the code at that level in this case, as it is that code that handles loading the raw model.json files etc.
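For reference, a minimal sketch of those save/load destinations, again assuming the global `tf` namespace; the model name `my-model` is a placeholder:

```javascript
// Each destination is selected by a URL-like scheme in model.save().
async function cacheModel(model) {
  await model.save('localstorage://my-model'); // browser localStorage
  await model.save('indexeddb://my-model');    // browser IndexedDB
}

async function loadCachedModel() {
  // Load back from IndexedDB (swap the scheme for localStorage).
  return tf.loadLayersModel('indexeddb://my-model');
}
```

IndexedDB generally has a much larger quota than localStorage, so it is usually the better fit for model weights.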
Hello,
It depends on your architecture.
In my case, I work with a Node.js application on one side, where the TensorFlow model runs, and a web interface on the other side for action input, which is sent to the Node.js app.
Is that your configuration?
Best regards.