What's the deal with tf.experimental

When I refer to the official TensorFlow text processing tutorials, I see tf.experimental being used in many places. From the name it is clear that these are experimental APIs, something that should not be used in production code since things might change. However, I don’t understand why they are so widely used in the official tutorials. Can we have these tutorials built using more stable APIs? Something people can refer to and use in their production code?

1 Like

Hi @dhavalsays ,

I guess you’re mentioning

tensorflow.keras.layers.experimental

correct?

Thanks for the feedback.
Do you have any specific task in mind?

1 Like

You could build your own versions of all these layers using low-level string ops, tensorflow_text, and tf.lookup.

That approach is verbose and hard to get right. Our overall assessment here was that using these experimental symbols now helps people get the tasks done, with minimal code changes expected in the future. The team that owns these layers is planning to get them out of experimental ASAP. I don’t expect significant changes.
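
To make the contrast concrete, here is a minimal sketch of the do-it-yourself route with low-level string ops and tf.lookup, followed by the same task with the experimental layer. The vocabulary and preprocessing rules are toy examples of my own, not taken from the tutorials:

```python
import tensorflow as tf

# Toy vocabulary for illustration only.
vocab = ["the", "cat", "sat", "on", "mat"]
table = tf.lookup.StaticVocabularyTable(
    tf.lookup.KeyValueTensorInitializer(
        keys=tf.constant(vocab),
        values=tf.range(len(vocab), dtype=tf.int64)),
    num_oov_buckets=1)  # every unknown token maps to one extra id

def vectorize(text):
    # Lowercase, strip punctuation, split on whitespace, then look up token ids.
    text = tf.strings.lower(text)
    text = tf.strings.regex_replace(text, r"[^\w\s]", "")
    tokens = tf.strings.split(text)
    return tf.ragged.map_flat_values(table.lookup, tokens)

print(vectorize(tf.constant(["The cat sat on the mat!"])))

# Roughly the same behaviour with the experimental layer the tutorials use
# (its default standardization also lowercases and strips punctuation):
vectorizer = tf.keras.layers.experimental.preprocessing.TextVectorization()
vectorizer.adapt(tf.constant(["the cat sat on the mat"]))
print(vectorizer(tf.constant(["The cat sat on the mat!"])))
```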

2 Likes

It is also useful for API stability, since the namespace is exposed earlier to collect more user feedback, including from newcomers.

3 Likes

That’s an even better answer than mine :rofl:.

If you find a mistake in the API design and it’s in experimental, we can still fix it. Outside of experimental we have to live with it.

1 Like

Yes, I am referring to tensorflow.keras.layers.experimental. I am going through the text_classification_rnn tutorial and I see the experimental layer used there; I also see it being used in many other tutorials. When is this experimental API going to become stable? I would not feel comfortable deploying this code to production. I also make coding videos on YouTube, and when I start seeing experimental APIs it makes me nervous that my coding video will become outdated pretty soon because I am using an unstable API. I hope you understand my concern.
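
For reference, this is roughly the pattern I mean. This is a simplified sketch with toy data of my own, not the tutorial’s actual IMDB pipeline, and the layer sizes are made up:

```python
import tensorflow as tf

# Toy dataset standing in for the tutorial's real text data.
texts = tf.constant(["good movie", "bad movie", "great film", "terrible film"])
labels = tf.constant([1, 0, 1, 0])
train_ds = tf.data.Dataset.from_tensor_slices((texts, labels)).batch(2)

# This is the experimental symbol in question.
encoder = tf.keras.layers.experimental.preprocessing.TextVectorization(
    max_tokens=1000)
encoder.adapt(train_ds.map(lambda text, label: text))

model = tf.keras.Sequential([
    encoder,
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16, mask_zero=True),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16)),
    tf.keras.layers.Dense(1),
])
model.compile(loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              optimizer="adam", metrics=["accuracy"])
model.fit(train_ds, epochs=1)
```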

1 Like

In general, all APIs not under an experimental namespace can only change when the major version of TF changes (like TF 1.x → TF 2.x). We guarantee that code written for TF 2.1 will still work in TF 2.42 if it only uses APIs outside of experimental (there are some exceptions where APIs that are rarely used but very broken might change between minor versions, but we will announce such changes well in advance).
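
For example, one way to write tutorial or video code that survives a symbol graduating out of experimental is to prefer the stable name when it exists and fall back otherwise. This is just a sketch, not official guidance, and it assumes the class keeps its name when it moves, as TextVectorization eventually did:

```python
import tensorflow as tf

# Prefer the stable symbol if this TF release has it; otherwise fall back to
# the experimental namespace. (Sketch only; assumes the class keeps its name.)
TextVectorization = getattr(tf.keras.layers, "TextVectorization", None)
if TextVectorization is None:  # older TF 2.x releases
    TextVectorization = tf.keras.layers.experimental.preprocessing.TextVectorization

encoder = TextVectorization(max_tokens=1000)
```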

2 Likes

I think the concern is not so much about the experimental namespace itself but rather its prevalence in the official tutorials on tensorflow.org. And it’s a valid concern, especially if you’re building educational resources that need a longer shelf life.

We don’t have an official editorial policy on this other than that the content on tensorflow.org represents TensorFlow “as it is”, meaning the site docs should reflect the latest TF release (though we host older API reference versions and archive old docs). Notebook tutorials are executed and tested regularly to ensure everything works and the site is up to date, or at least doesn’t break. Most usage of experimental APIs in the TF docs is an improvement in API usability that, for whatever reason, failed to make the stable window. For an existing doc that uses an experimental API, you can always look through the docs version branches to see how something was done before the experimental API was introduced.

Generally, if an experimental API is in an official tutorial, there’s a strong chance a decision was made that this is the best way forward. While tensorflow.org has a lot of content and docs for many packages (some more stable than others), maybe we can be more thoughtful about how we introduce experimental APIs in the “core” guide and tutorials to make sure we provide a solid foundation for the larger TensorFlow ecosystem to build on.

3 Likes

Billy, you conveyed my concern most accurately. I run a YouTube channel called codebasics (320k subscribers) where I mainly teach AI, data science, etc. In my present deep learning series I have this feeling of regret that I should have selected PyTorch, because when I see experimental all over the place in the official TF tutorials I get the feeling that “TF is probably a work-in-progress framework and I should opt for a more stable framework such as PyTorch.” I have a huge following on YouTube and people rely on my videos for their learning as well as their real-life usage. I am reluctant to use any experimental API nowadays. I have a lot of respect for Google as a company, and I really wish you would look at the whole problem from your API consumers’ standpoint and try to resolve these issues sooner rather than later.

1 Like

Billy, is there a way to get in touch with you or some other folks who work on the TensorFlow team? I run a YouTube channel called “codebasics” (320k subscribers) and my ML and deep learning videos are popular on YouTube. If someone searches “tensorflow tutorial” on YouTube, my videos come up in the first 5 search results. I would like to collaborate with the TensorFlow team so that I can provide the best quality education to people for free and at the same time promote the TensorFlow framework. My email id is removed. Please send me an email; I would like to discuss some collaboration ideas that could help you make your documentation better as well as provide help in video format.

1 Like

Happy to chat! I’ll send you an email in a sec.