I have a large dataset (185 GB) on which I plan to run some machine learning algorithms. The data sits on a local machine with limited computational power. I have access to a remote cluster where I can run my expensive algorithms; it has 1 TB of memory and is quite fast. But for some reason, I only have 2 GB (!) of disk storage on the remote server.
I can connect to the cluster via SSH. Is there any way, in Python, to load the dataset into RAM over SSH?
Any general tips on how to solve this problem are greatly appreciated.
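To make the question concrete, here is the kind of thing I have in mind (an untested sketch, not something I know works): pipe the file through the SSH connection into the remote Python process's standard input, e.g. `ssh user@cluster 'python3 train.py' < dataset.bin`, so the data never touches the remote disk. The script name `train.py` and the helper below are hypothetical; the remote side would buffer stdin into RAM roughly like this:

```python
import io
import sys

def load_stdin_to_ram(chunk_size=1 << 20):
    """Read the raw byte stream arriving on stdin into an in-memory buffer.

    Hypothetical usage, run on the remote cluster from the local machine:
        ssh user@cluster 'python3 train.py' < dataset.bin
    so the dataset flows over SSH straight into RAM without ever being
    written to the (tiny) remote disk.
    """
    buf = io.BytesIO()
    stream = sys.stdin.buffer  # binary-mode stdin
    # Read in 1 MiB chunks until EOF to avoid holding two copies at once.
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        buf.write(chunk)
    buf.seek(0)  # rewind so callers can read from the start
    return buf
```

Would something along these lines be workable for 185 GB, or is there a more standard approach (e.g. mounting the data remotely)?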