Issue with MultiHeadAttention in TensorFlow

Hi, I am facing some issues with TensorFlow's MultiHeadAttention layer.

Even though the input shapes are consistent, it still throws an error about an incorrect input dimension.

Can anyone help, please?

I am building a local transformer, and it is only a single layer.

Hi @Satish_Hiremath, welcome to the TensorFlow Forum!

Could you provide more details about your implementation and the specific error message you’re encountering? This will help me understand the issue better and assist you more effectively.
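In the meantime, here is a minimal sketch of how `tf.keras.layers.MultiHeadAttention` is typically called for self-attention. The shapes below (batch size, sequence length, feature size, head count) are hypothetical placeholders, since your actual model and error message are not shown; the key point is that query, value, and key inputs are expected to be rank-3 tensors of shape `(batch, seq_len, features)`.

```python
import tensorflow as tf

# Hypothetical single-layer self-attention example; the shapes here
# are illustrative, not taken from the original post.
mha = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=16)

# Inputs must be rank-3: (batch, seq_len, features).
x = tf.random.normal((2, 10, 64))  # batch=2, seq_len=10, features=64

# Self-attention: query, value, and key are all the same tensor.
out = mha(query=x, value=x, key=x)

# By default the output projection maps back to the query's feature size.
print(out.shape)  # (2, 10, 64)
```

A common cause of dimension errors is passing a rank-2 tensor (e.g. `(seq_len, features)` without a batch axis), so checking that every input to the layer has an explicit batch dimension is a good first step.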

Thank you!