OOM when allocating tensor with shape

I tried to update the weights of the Embedding layer in my Keras model, where the embedding matrix has shape (1393860, 768), using:

 from tensorflow.keras.layers import Embedding

 se1 = Embedding(1393860, 768, mask_zero=True)
 model.layers[1].set_weights([embedding_matrix])
 model.layers[1].trainable = False

but got
tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[1393860,768] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:RandomUniform]
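For scale, a float32 tensor with that shape already needs about 4 GiB for a single copy, before the rest of the model is counted. A rough back-of-the-envelope estimate (assuming float32 weights, as in the error message):

 rows, cols = 1393860, 768
 bytes_per_float32 = 4
 size_gib = rows * cols * bytes_per_float32 / 2**30
 print(size_gib)  # ~4.0 GiB for one copy of the embedding weight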

What is your GPU? I was getting this error when I fed too large a training data size to the GPU.

I’m using an RTX 2060 with 32 GB of RAM.

Can you try a smaller value that is suitable for the model?

Excuse me, I didn’t get what you mean. The model works with smaller sizes than this: I’m using BERT to get word embeddings, and with 150,000 sentences it worked. But now I need to use the words themselves, which come to 1,393,860 words, and that is where I hit the GPU problem.
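One pattern sometimes used for embedding tables this large is to create the layer’s weight on the CPU, so the initial RandomUniform allocation in the error never happens on the GPU. This is only a sketch, not verified against this model; it assumes embedding_matrix is the (1393860, 768) NumPy array, and whether training then fits in GPU memory is a separate question:

 import tensorflow as tf

 with tf.device('/CPU:0'):
     se1 = tf.keras.layers.Embedding(1393860, 768, mask_zero=True)
     se1.build((None,))                    # creates the weight on the CPU
     se1.set_weights([embedding_matrix])   # copy the pretrained matrix in
 se1.trainable = False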

I guess your GPU memory is not enough for this data size, but I don’t know if there is another way. I would try the training batch size: reduce it from 64 to 32, for example.
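If it helps, the batch size is just the batch_size argument of fit. A minimal sketch (x_train, y_train, and epochs are placeholders for your own training setup); note this mainly reduces activation memory per step, not the size of the embedding weight itself:

 history = model.fit(x_train, y_train, epochs=10, batch_size=32)  # instead of batch_size=64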

I tried to reduce batch_size but the error is still there.

Hello, I’ve encountered the same issue. Have you resolved it? If you have, could you please let me know how to fix it?