Dear contributors and users of TensorFlow Addons,
As many of you know, TensorFlow Addons (TFA) is a repository of community-maintained and contributed extensions for TensorFlow, first created in 2018 and maintained by the [SIG-Addons community](https://github.com/tensorflow/community/blob/master/sigs/addons/CHARTER.md). Over the course of 4 years, 200 contributors have built the TFA repository into a community-owned and managed success that is being utilized by over 8,000 GitHub repositories according to our dependency graph. I’d like to take a moment to sincerely thank everyone involved as a contributor or community member for their efforts.
Recently, there has been increasing overlap in contributions and scope between TFA and the [Keras-CV](https://github.com/keras-team/keras-cv) and [Keras-NLP](https://github.com/keras-team/keras-nlp) libraries. To prevent future overlap, we believe that new and existing addons to TensorFlow will be best maintained in Keras project repositories, where possible.
## Decision to Wind Down TensorFlow Addons
We believe that it is in the best interest of the TensorFlow community to consolidate where TensorFlow extensions can be utilized, maintained and contributed. Because of this, it is bittersweet that we are announcing our plans to move TensorFlow Addons to a minimal maintenance and release mode.
SIG Addons will end development and stop introducing new features to this repository. TFA will transition to a minimal maintenance and release mode for one year in order to give you appropriate time to move any dependencies to the overlapping repositories in our TensorFlow community (Keras, Keras-CV, and Keras-NLP). Going forward, please consider contributing to the [Keras-CV](https://github.com/keras-team/keras-cv) and [Keras-NLP](https://github.com/keras-team/keras-nlp) projects.
## Background:
The [original RFC proposal](https://github.com/tensorflow/community/blob/master/rfcs/20181214-move-to-addons.md) for TFA was dated 2018-12-14 with the stated goal of building a community managed repository for contributions that conform to well-established API patterns, but implement new functionality not available in core TensorFlow as defined in our [Special Interest Group (SIG) charter](https://github.com/tensorflow/community/blob/master/sigs/addons/CHARTER.md).
As the years have progressed, new repositories with healthy contributor communities (Keras-CV, Keras-NLP, etc.) have been created with goals similar to ours, and their criteria for accepting contributions overlap significantly with our own (e.g. the number of required citations). Additionally, since [Keras split out of core TensorFlow](https://github.com/tensorflow/community/blob/master/rfcs/20200205-standalone-keras-repository.md) in 2020, the barrier for community contribution has been substantially lowered.
Understandably, there has been increasing ambiguity regarding where contributions should land and where they will be best maintained. Many features that are available in TFA are simultaneously available in other TensorFlow community repositories. As just a few examples (see the migration sketch after this list):
* Random Cutout: [TFA](https://www.tensorflow.org/addons/api_docs/python/tfa/image/random_cutout) & [Keras-CV](https://keras.io/api/keras_cv/layers/preprocessing/random_cutout/)
* AdamW Optimizer: [TFA](https://www.tensorflow.org/addons/api_docs/python/tfa/optimizers/AdamW) & [Keras](https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/experimental/AdamW)
* Multihead Attention: [TFA](https://www.tensorflow.org/addons/api_docs/python/tfa/layers/MultiHeadAttention) & [Keras](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention)
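To illustrate the kind of migration this overlap implies, here is a minimal sketch of swapping the TFA AdamW optimizer for its Keras counterpart. It assumes TensorFlow 2.11+, where the optimizer is also exposed as `tf.keras.optimizers.AdamW` (the link above points to the earlier `experimental` namespace); argument names can differ between releases, so check the linked API docs before relying on the details.

```python
# Minimal migration sketch (assumes TF 2.11+ and a TFA release installed
# during the one-year maintenance window).
import tensorflow as tf
import tensorflow_addons as tfa  # still available while TFA is in maintenance mode

# Before: the TFA implementation of Adam with decoupled weight decay.
tfa_optimizer = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=1e-3)

# After: the Keras implementation that supersedes it.
keras_optimizer = tf.keras.optimizers.AdamW(learning_rate=1e-3, weight_decay=1e-4)

# Either optimizer plugs into the usual Keras training flow.
model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
model.compile(optimizer=keras_optimizer, loss="mse")
```

The same pattern applies to the other overlapping features listed above, though the APIs are not always identical (for example, constructor and call signatures tend to differ between TFA's and Keras's MultiHeadAttention layers), so a one-line swap is not always sufficient.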
As part of the original RFC, our Special Interest Group agreed to migrate code from the tf.contrib and keras.contrib repositories. In doing so, TFA inherited C++ custom ops, which made TFA a unique place in the TensorFlow community to contribute C++ custom ops to be built and distributed. However, we’ve recently helped in [migrating much of that infrastructure to Keras-CV](https://github.com/keras-team/keras-cv/pull/890#event-7642725159) so that they can compile and distribute custom ops as they see fit.
## What’s Next:
* One year of maintenance and releases for TFA
* Addition of warnings in the upcoming TFA 0.20 release
* Creation of a public document analyzing where TFA features already overlap with the other repositories
* [GitHub Tracking Project](https://github.com/tensorflow/addons/projects/5)
This is a bittersweet moment, and again I just want to thank everyone for their efforts to make this repo as successful as it has been.
Sincerely,
Sean Morgan