I’m getting this error when I fit my model:
tensorflow/core/tpu/kernels/tpu_compilation_cache_external.cc:112] Asked to propagate a dynamic dimension from hlo transpose.3750@{}@0 to hlo %all-reduce.3755 = f32[<=70,256]{1,0} all-reduce(f32[<=70,256]{1,0} %transpose.3750), replica_groups={{0,1,2,3,4,5,6,7}}, to_apply=%sum.3751, metadata={op_type="CrossReplicaSum" op_name="CrossReplicaSum_33" source_file="dummy_file_name" source_line=10}, which is not implemented.
1013 tpu_program_group.cc:90] Check failed: xla_tpu_programs.size() > 0 (0 vs. 0)
However, I am passing explicit input shapes, like so:
Config.COMPUTED_BATCH_SIZE = 128

with strategy.scope():
    model = my_model()
    input_shapes = [
        [Config.COMPUTED_BATCH_SIZE, 192],
        [Config.COMPUTED_BATCH_SIZE, 192],
        [COMPUTED_CHANNELS, 105, 129, 100],
        [COMPUTED_CHANNELS, 105, 129, 100],
        [COMPUTED_CHANNELS, 105, 129, 100],
        [Config.COMPUTED_BATCH_SIZE, 70],
        [Config.COMPUTED_BATCH_SIZE, 320]
    ]
    model.build(input_shape=input_shapes)
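For context, strategy is a TPUStrategy over eight cores (matching replica_groups={{0,1,2,3,4,5,6,7}} in the log). A minimal sketch of the setup, assuming the standard TF2 TPU runtime initialization (the resolver arguments are placeholders for my actual environment):

import tensorflow as tf

# Sketch: standard TF2 TPU initialization; resolver arguments are placeholders.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)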
Edit:
I’ve tracked it down to this code; the <=70 bound in the all-reduce shape above matches the 70-wide descriptors input:
hidden_size = 128
self.descriptor_embedding = layers.Dense(
    hidden_size * 2,  # 256
    activation='relu',
    input_shape=(Config.COMPUTED_BATCH_SIZE, 70)
)

learned_descriptors = tf.expand_dims(
    self.descriptor_embedding(descriptors),
    1
)  # [BS, 1, HS * 2]
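Eagerly, this layer behaves as expected. Here is a standalone sketch that isolates the pattern (batch_size and the zero-filled descriptors tensor are stand-ins for my real pipeline):

import tensorflow as tf
from tensorflow.keras import layers

hidden_size = 128
batch_size = 128  # stand-in for Config.COMPUTED_BATCH_SIZE

# Same construction as above; note that input_shape here includes
# the batch dimension.
descriptor_embedding = layers.Dense(
    hidden_size * 2,  # 256
    activation='relu',
    input_shape=(batch_size, 70)
)

descriptors = tf.zeros([batch_size, 70])  # dummy stand-in for real inputs
learned_descriptors = tf.expand_dims(descriptor_embedding(descriptors), 1)
print(learned_descriptors.shape)  # (128, 1, 256)

Run like this it works fine; the failure itself only appears when the model is compiled for the TPU (the log above is from the TPU compilation cache).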
Any ideas?