Issue
I am an astronomy student working with large data sets. I have 80 TB of .fits files on a supercomputer that I am trying to process with a Python script. I could process the data on the supercomputer by submitting a job (which sits in the queue for ages), or I could process it on my local desktop. However, I cannot download all 80 TB to my desktop due to storage constraints. Is there a way to run the processing script on my local desktop while it reads the data from the supercomputer over SSH?
Thanks.
Solution
Check out these related questions:
- Perform commands over ssh with Python
- Download files over SSH using Python
- How to list all the folders and files in the directory after connecting through sftp in python
You could put it in a loop: fetch one file, parse it, delete the local copy, then fetch the next file, and so on. That way only a single file occupies local disk at any time.
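Here is a minimal sketch of that loop using Paramiko's SFTP client together with astropy for reading the FITS files. It assumes key-based SSH authentication is already set up; HOST, USER, REMOTE_DIR, and the process() function are placeholders you would replace with your own values and analysis code.

```python
import os
import paramiko
from astropy.io import fits


def process(hdul):
    """Placeholder for your analysis code."""
    print(hdul[0].header)


# Hypothetical connection details -- substitute your cluster's values.
HOST = "supercomputer.example.edu"
USER = "astro_student"
REMOTE_DIR = "/data/fits_archive"

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect(HOST, username=USER)  # assumes an SSH key is configured
sftp = client.open_sftp()

try:
    for name in sftp.listdir(REMOTE_DIR):
        if not name.endswith(".fits"):
            continue
        local_path = os.path.join("/tmp", name)
        # Fetch one file at a time so only one file uses local storage.
        sftp.get(f"{REMOTE_DIR}/{name}", local_path)
        with fits.open(local_path) as hdul:
            process(hdul)
        os.remove(local_path)  # free the space before the next file
finally:
    sftp.close()
    client.close()
```

Note that this trades queue time for network time: every byte of the 80 TB still crosses the wire once, so throughput will be limited by your connection to the cluster.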
Answered By - tryonlinux
Answer Checked By - Robin (WPSolving Admin)