Access files on local storage via Jupyter Notebook on Paperspace

I need to access quite a few files from my Jupyter Notebook code on Paperspace; all of these files are stored on my local machine, so I get a FileNotFoundError. I need to access about 3000 images. Are there any convenient methods to upload a large number of large files?

img = cv2.imread(r"C:\\Users\\Pictures\\Dataset\\DSC_0258.JPG")
print(img)
img.shape

None
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-25c8cf1433b1> in <module>
      3 img = cv2.imread(r"C:\\Users\\Pictures\\Dataset\\DSC_0258.JPG")
      4 print(img)
----> 5 img.shape

AttributeError: 'NoneType' object has no attribute 'shape'
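(Note that cv2.imread returns None instead of raising when it cannot read a file, which is why the print shows None and .shape then fails. A minimal guard, sketched with the standard library only so it runs without cv2 installed; the cv2 call itself is shown as a comment:)

```python
import os

def check_image_path(path):
    # cv2.imread silently returns None when the path does not exist on
    # *this* machine -- the notebook runs on Paperspace, not on the
    # local Windows box -- so fail early with a clear error instead.
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} is not visible from this notebook; "
            "upload the files to Paperspace first."
        )
    return path

# Usage, once the files are actually on the notebook machine:
# img = cv2.imread(check_image_path("Dataset/DSC_0258.JPG"))
```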

Hi @Vedant_Modi, the easiest way is probably to zip up the files and upload the archive with the upload button in the Gradient Notebook. Then you can unzip them and you should be all set to go.
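(Creating the archive on the local machine can be done with the standard library; a minimal sketch, where the folder path in the comment is the example one from the question:)

```python
import shutil

def pack_dataset(folder, archive_name="dataset"):
    # Pack the whole image folder into a single .zip; one archive
    # uploads far more reliably than ~3000 individual files.
    # Returns the path of the created archive.
    return shutil.make_archive(archive_name, "zip", folder)

# e.g. on the local Windows machine:
# pack_dataset(r"C:\Users\Pictures\Dataset")
```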

How do I unzip it?

Hi there, you can try the following to unzip a file:

import zipfile

with zipfile.ZipFile("file.zip", "r") as zip_ref:
    zip_ref.extractall("targetdir")

If that doesn’t work, let me know.

Can I unzip it to a versioned dataset?

Hey, I’ll investigate this for you and let you know. :slight_smile:

Currently, Datasets don’t support the .zip extension. You could run a Workflow to expand the archive into a dataset. Alternatively, you can upload a local folder using the CLI.

I believe you can use these commands:

gradient datasets files put [OPTIONS]

Options:

--id <dataset_version_id>
    Required. Dataset version ID (ex: dsr8k5qzn401lb5:klfoyy9)

--source-path <source_paths>
    Required. File or directory to put

--target-path <target_path>
    Target dataset file path

--apiKey <api_key>
    API key to use this time only

--optionsFile <options_file>
    Path to YAML file with predefined options

--createOptionsFile
    Generate template options file

More commands can be found on: https://paperspace.github.io/gradient-cli/gradient.cli.html#gradient-datasets.
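Putting the options together, a hypothetical invocation (the version ID is the placeholder from the option description above; substitute your own dataset version and paths):

```shell
# Upload a local folder into an existing dataset version
gradient datasets files put \
  --id dsr8k5qzn401lb5:klfoyy9 \
  --source-path ./Dataset \
  --target-path /images
```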

Thanks for the solution, but I've given up; this is too slow for me. I let the upload script run for 12 hours and it still hadn't finished.
Replacing scp with a web terminal, what a genius design. I'd gladly throw the subscription fee away.

I'm facing the same problem with extremely slow upload rates. Care to explain what your solution was?