TensorFlow for Android tutorials

I’m working on an Android app for object detection. Although examples are provided on the TensorFlow website, they’re not exactly what I’m looking for, and I want to create this project from scratch. I can’t find any free course or tutorial on the topic; the examples on tensorflow.org are too complex for me as a beginner. Other tutorials I find always require downloading pre-written code from GitHub and then show how to customize it, instead of showing how everything works.

Hi, for a full tutorial I’d follow this:

There’s a full set of tutorials that goes from zero to a deployed model. It might be able to help you.

Thank you, but those still say “how to build an (object detection, image classification, …) app”, and then inside they tell you to download all the code from GitHub and, in the best case, make very small modifications to it. This drives me crazy.

Usually those repositories also include the full code, without the need for the small modifications, in a separate folder.

It’s not clear to me exactly what you want. Would that be a tutorial that starts from zero code and builds it step by step?
The problem with that approach is that maybe 90% of the work would be related to creating the Android app, with a lot of Android specifics, and that’s not the goal here. The tutorials target how to use ML in your app, and that requires some code to be in place already.

OK, I get it, thanks.

I don’t know if I have to create another topic for this question. My question is: if I prefer working in TensorFlow instead of TensorFlow Lite, what will stop me from creating my model in TensorFlow, saving it in h5 format, and converting it into tflite format using `tf.lite.TFLiteConverter`?

That’s the expected way of working.
You build your model using regular TensorFlow, and when you need to deploy it on a mobile device you use the conversion tool. Just a minor tip: it’s better to save in the SavedModel format rather than h5 (Save, serialize, and export models  |  TensorFlow Core).

The only blocker you might have is that not all operations are available in TFLite, since it’s meant for inference only and focused on speed. But when converting the model, the tool will tell you if there’s any issue.
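If the converter does complain about unsupported ops, one option (my assumption here, not something every model needs) is to enable the TensorFlow ops fallback, at the cost of a larger runtime dependency on Android:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")

# Let ops without a TFLite builtin kernel fall back to the TensorFlow
# op set (requires adding the Select TF Ops library to the Android app).
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fallback to TensorFlow ops
]

tflite_model = converter.convert()
```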


Yes, I suggest converting your model with a fail-fast approach, so that you verify early that your model can be converted before you invest too much time in it.
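As a rough sketch of that fail-fast idea: try converting the untrained model right after you define the architecture, so any unsupported op surfaces before you spend time training (the model below is again just a placeholder):

```python
import tensorflow as tf

# Define the architecture only; no training yet.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(320, 320, 3)),
    tf.keras.layers.SeparableConv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4),
])

# Attempt the conversion straight away; if an op isn't supported,
# the converter raises an error here instead of after training.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.convert()
print("Architecture converts to TFLite; safe to start training.")
```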


Thank you, I appreciate your help.
