Spatial Transformer Networks (STNs) have been around since 2015, but I haven’t found an easy-to-follow example of them for #Keras.
On the other hand, Kevin Zakka’s implementation of STN is by far one of the cleanest, but it’s written purely in TensorFlow 1. So, I decided to take the utility functions from his implementation and build an end-to-end #Keras example out of them. You can find it here:
It comes with a Colab Notebook and a TensorBoard callback that helps visualize the progression of the transformations learned by the STN during training.
Notice how the STN module figures out transformations for the dataset that can help boost the model’s performance -
Visualizing these transformations is supported in TensorBoard, which is what I’ve demonstrated in my example above. At the time of development, it wasn’t supported on tensorboard.dev.