Hi good people from Paperspace,
I have been training some models on the free GPUs in Jupyter notebooks to decide whether this is a service I want to pay for.
It worked fine at first, but at some point my GPU memory started spiking to 100% immediately, even for the easiest and smallest tasks, so I can no longer use any of the training capabilities.
Shutting down the kernel resets the GPU memory, but as soon as I run a script it spikes up to 100% again and I receive the following error:
RuntimeError: CUDA out of memory. Tried to allocate 128.00 MiB (GPU 0; 15.90 GiB total capacity; 15.11 GiB already allocated; 91.50 MiB free; 15.17 GiB reserved in total by PyTorch)
Is this a common issue, and how can I resolve it?
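For anyone hitting the same error, here is a minimal sketch of how I check and try to reclaim PyTorch's CUDA memory from a notebook cell (the helper name is my own; note that `empty_cache()` only releases cached blocks that no Python object still references, so lingering tensors or models must be `del`-ed first):

```python
import gc

import torch


def report_and_clear_gpu_memory():
    """Collect dead references, release PyTorch's CUDA cache,
    and return (allocated_bytes, reserved_bytes), or None without a GPU."""
    # Drop unreachable Python objects so their tensors become freeable
    gc.collect()
    if not torch.cuda.is_available():
        return None
    # Hand cached, unreferenced blocks back to the driver
    torch.cuda.empty_cache()
    allocated = torch.cuda.memory_allocated()  # bytes held by live tensors
    reserved = torch.cuda.memory_reserved()    # bytes held by the caching allocator
    print(f"allocated: {allocated / 1024**2:.1f} MiB, "
          f"reserved: {reserved / 1024**2:.1f} MiB")
    return allocated, reserved


report_and_clear_gpu_memory()
```

If `allocated` stays high even after this, some object in the notebook (a model, an optimizer, or a tensor kept in `Out[...]` history) is still holding the memory, and restarting the kernel is the only reliable reset.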
PS - It seems Paperspace does not accept German credit cards. I tried a couple of times to pay for an account and was denied every time.