Hi, I am working through a lab exercise on the Encoder-Decoder architecture. Below is the sequence of commands I am running hands-on on my system. I got stuck at a `map` call, where the message tells me to report something to the TensorFlow team.
```python
all_ids = ...  # insert code here
ids_dataset = tf.data.Dataset.from_tensor_slices(all_ids)
sequences = ids_dataset.batch(seq_length + 1, drop_remainder=True)
```
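To make the batching step concrete, here is a plain-Python sketch of what `ids_dataset.batch(seq_length + 1, drop_remainder=True)` produces; the `all_ids` values below are stand-ins for real character IDs, and `seq_length = 4` is just a small illustrative choice:

```python
seq_length = 4
all_ids = list(range(11))  # stand-in for the real character IDs

# batch(seq_length + 1, drop_remainder=True) groups consecutive IDs into
# chunks of length seq_length + 1 and drops any incomplete trailing chunk.
chunk = seq_length + 1
sequences = [all_ids[i:i + chunk]
             for i in range(0, len(all_ids) - chunk + 1, chunk)]
print(sequences)  # [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]; the trailing 10 is dropped
```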
Now I am creating training examples: a dataset of (input, label) pairs, where both input and label are sequences. At each time step the input is the current character and the label is the next character.
Here is a function that takes a sequence as input, duplicates it, and shifts it to align the input and label for each timestep:
```python
def split_input_target(sequence):
    input_text = sequence[:-1]
    target_text = sequence[1:]
    return input_text, target_text

split_input_target(list("Tensorflow"))

dataset = sequences.map(split_input_target)
```
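As a sanity check, the function itself behaves correctly on a plain Python list, independent of any TensorFlow graph machinery:

```python
def split_input_target(sequence):
    # Input is everything except the last element; target is everything
    # except the first, so target[i] is the character that follows input[i].
    input_text = sequence[:-1]
    target_text = sequence[1:]
    return input_text, target_text

inp, tgt = split_input_target(list("Tensorflow"))
print(inp)  # ['T', 'e', 'n', 's', 'o', 'r', 'f', 'l', 'o']
print(tgt)  # ['e', 'n', 's', 'o', 'r', 'f', 'l', 'o', 'w']
```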
I am getting the warning below, and it says to report it to the TensorFlow team:
```
WARNING:tensorflow:AutoGraph could not transform <function split_input_target at 0x0000014132362268> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output.
Cause: Unable to locate the source code of <function split_input_target at 0x0000014132362268>. Note that functions defined in certain environments, like the interactive Python shell, do not expose their source code. If that is the case, you should define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.experimental.do_not_convert. Original error: could not get source code
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
```
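Note that this is a warning, not an error: the message says the function "will run it as-is", so the pipeline still works. The warning itself suggests the fix: decorate the function with `@tf.autograph.experimental.do_not_convert`. Here is a minimal self-contained sketch; `tf.range(20)` and `batch(5)` are stand-ins for the real `all_ids` and `seq_length + 1`:

```python
import tensorflow as tf

# Opt this function out of AutoGraph conversion, as the warning suggests.
# Plain slicing needs no graph rewriting, so nothing is lost.
@tf.autograph.experimental.do_not_convert
def split_input_target(sequence):
    input_text = sequence[:-1]
    target_text = sequence[1:]
    return input_text, target_text

ids = tf.range(20)  # stand-in for all_ids
ids_dataset = tf.data.Dataset.from_tensor_slices(ids)
sequences = ids_dataset.batch(5, drop_remainder=True)
dataset = sequences.map(split_input_target)

for inp, tgt in dataset.take(1):
    print(inp.numpy(), tgt.numpy())  # [0 1 2 3] [1 2 3 4]
```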
Team, please assist if this is a blocker.