"I’m quite new to this, and currently, I’m experimenting with models I found online using some self-generated data. Initially, I had a model that performed well with tensor shapes of around 576x90, providing relatively good results. Recently, I increased the size of my data to around 3000x90, but I’m encountering out-of-memory errors.
My original tensors were processed in batches of about 256, but now even a batch size of 1 runs out of memory. I’ve upgraded to a more powerful machine with 45GB of RAM and 80GB of VRAM, and it’s still running out of memory. I had assumed that much larger tensors were commonly used without these issues."
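For context, the raw input tensors described here are tiny compared with 80GB of VRAM (a float32 batch of 256 × 576 × 90 is roughly 50 MiB, and a single 3000 × 90 sample is about 1 MiB), so the out-of-memory error most likely comes from whatever the model builds on top of them, such as intermediate activations that grow with sequence length. Below is a minimal sketch, assuming PyTorch and float32 inputs (neither is stated in the question), for estimating the input footprint and tracking peak GPU allocation around a training step:

```python
# Minimal sketch: assumes PyTorch and float32 inputs; shapes are taken from the question.
import torch

def tensor_mib(shape, dtype=torch.float32):
    """Rough memory footprint of a dense tensor of the given shape, in MiB."""
    n = 1
    for d in shape:
        n *= d
    return n * torch.tensor([], dtype=dtype).element_size() / 2**20

print(tensor_mib((256, 576, 90)))   # old setup, batch of 256: ~50 MiB
print(tensor_mib((1, 3000, 90)))    # new setup, batch of 1: ~1 MiB

# If CUDA is available, measure peak usage around one forward/backward pass to
# see how much the model's intermediate activations add on top of the inputs.
if torch.cuda.is_available():
    torch.cuda.reset_peak_memory_stats()
    # ... run one training step here ...
    peak = torch.cuda.max_memory_allocated() / 2**20
    print(f"peak allocated: {peak:.1f} MiB")
```

Comparing the printed input footprint with the peak allocation after a single step should show whether the memory is consumed by the data itself or by what the model does with it.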