tf.matmul gives different results for two matrices A and B, where B is just A with additional rows

import tensorflow as tf
from tensorflow.keras.layers import Embedding

# Set up
embedding = Embedding(36712, 512, mask_zero=True)  # input_dim must exceed the largest index used (36711)
example_sequence = tf.constant([36710, 5095, 466, 16678, 5, 3, 5152, 36711] + [0] * (64 - 8))

pad_embed    = example_sequence       # input with padding,    shape (64,)
no_pad_embed = example_sequence[:8]   # input without padding,  shape (8,)

no_pad_embed = embedding(no_pad_embed)  # shape (8, 512)  -> A
pad_embed    = embedding(pad_embed)     # shape (64, 512) -> B: same first 8 rows as A


# Real problem: I want to use matmul to do A @ A.T and B @ B.T
pad_out    = tf.matmul(pad_embed, pad_embed, transpose_b=True)
no_pad_out = tf.matmul(no_pad_embed, no_pad_embed, transpose_b=True)
# print(pad_out[:8, :8])
# print(no_pad_out)

print(tf.reduce_all(pad_out[:8, :8] == no_pad_out))
# False: the upper-left 8x8 submatrix of pad_out is not equal to no_pad_out
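Note that exact elementwise equality is a very strict test for float32 results. As a sketch (using hypothetical stand-in tensors, not the embeddings above), a tolerance-based comparison separates rounding-level noise from a real mismatch:

```python
import tensorflow as tf

# Hypothetical stand-ins for pad_out / no_pad_out: two float32 matrices
# whose entries differ only by rounding-level noise.
a = tf.constant([[0.1, 0.2], [0.3, 0.4]], dtype=tf.float32)
b = a + 1e-7  # simulate tiny accumulated rounding differences

# Exact equality is sensitive to the last bits...
print(tf.reduce_all(a == b).numpy())                 # False
# ...while a tolerance-based check treats the tensors as equal.
print(tf.reduce_all(tf.abs(a - b) < 1e-5).numpy())   # True
```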




The full reproduction is pasted above.
Description:

I want to compute matmul(A, transpose(A)) and matmul(B, transpose(B)), where B is just matrix A with additional rows: A is 8 by 512 and B is 64 by 512, with its first 8 rows exactly equal to those of A.

Explanation:
Let's call
A_out = A matmul transpose(A), which is 8 by 8,
B_out = B matmul transpose(B), which is 64 by 64.

B_out's top-left 8 by 8 submatrix should equal A_out.
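Mathematically that is true, but in floating point it only holds up to rounding: addition is not associative, so two kernels that accumulate the same 512 products in different orders can return results differing in the last bits. That is one plausible (unconfirmed) explanation for the False above. A minimal plain-Python illustration:

```python
# Floating-point addition is not associative: regrouping the same
# three terms changes the last bits of the result, so two reduction
# orders over identical products need not be bit-identical.
x = (0.1 + 0.2) + 0.3
y = 0.1 + (0.2 + 0.3)
print(x == y)  # False
print(x, y)    # 0.6000000000000001 0.6
```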

Additionally, NumPy gives the expected result:

import numpy as np

# Same computation with NumPy: A @ A.T and B @ B.T
pad_embed    = pad_embed.numpy()
no_pad_embed = no_pad_embed.numpy()

pad_out    = np.matmul(pad_embed, pad_embed.T)
no_pad_out = np.matmul(no_pad_embed, no_pad_embed.T)
# print(pad_out[:8, :8])
# print(no_pad_out)

print(np.all(pad_out[:8, :8] == no_pad_out))
# True: the upper-left 8x8 submatrix of pad_out equals no_pad_out
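Rather than bitwise equality, the usual way to compare results that may differ only in rounding is np.allclose. A small self-contained sketch with made-up values:

```python
import numpy as np

# Two arrays that differ only at rounding level, e.g. the same dot
# products accumulated in different orders (hypothetical values).
u = np.array([0.6000000000000001, 1.25])
v = np.array([0.6, 1.25])

print(np.array_equal(u, v))  # False: bitwise comparison fails
print(np.allclose(u, v))     # True: equal within default tolerances
```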

Hi @Syed_Hamza_Mohiuddin, thanks for reporting the issue. While reproducing it we observed the difference in results between TensorFlow and NumPy. We will look into it and get back to you. Thank you.