Installing custom libraries or deploying a container

Hi everyone,

I am quite new to the world of cloud computing, so this might be an extremely silly question. I would like to run some GPU simulations using the OpenMM environment. For that, I found that it is possible to install the required libraries just as in Google Colab. See the code here:

    import sys
    print(sys.version)
    # Install Miniconda into /usr/local so conda ends up on the default PATH
    !wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
    !bash Miniconda3-latest-Linux-x86_64.sh -bfp /usr/local
    !conda config --set always_yes yes
    !conda config --add channels omnia
    !conda config --add channels conda-forge
    # Create a dedicated environment with the simulation stack
    !conda create -n openmm python=3.6 git rdkit openbabel openmm mdtraj nglview pymbar pdbfixer openmmforcefields openforcefields openmoltools parmed
    # Make the new environment's packages importable from the notebook kernel
    sys.path.append('/usr/local/envs/openmm/lib/python3.6/site-packages') # Of course I had to change this path to the appropriate one
    # Install the openforcefield package directly from GitHub
    !pip install git+https://github.com/openforcefield/[email protected]
    import simtk.testInstallation
    simtk.testInstallation.main()

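If it helps, here is a quick sanity check I run after the cell above to confirm that OpenMM can actually see a GPU platform. This assumes OpenMM 7.x, where the import path is `simtk.openmm` (newer releases import as `openmm` instead):

    from simtk import openmm
    # List every platform this OpenMM build knows about; CUDA or OpenCL
    # should show up if the GPU toolkit is visible to the environment.
    for i in range(openmm.Platform.getNumPlatforms()):
        print(openmm.Platform.getPlatform(i).getName())
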
And then I need to run:

    !conda install ambertools

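One thing I am not sure about: since the packages above live in the openmm environment, running conda install without an environment name may put AmberTools into the base environment instead. A sketch of what I would try, assuming the environment is still called openmm:

    # Target the named environment rather than the base install (assumption: env is called "openmm")
    !conda install -n openmm -c conda-forge ambertools
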
However, I have noticed that the libraries persist only for as long as the notebook is running. Thus, I was wondering if there is a way to create a notebook that already contains the libraries I need for the job from the start.
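
The closest workaround I can think of is to snapshot the environment so it can be rebuilt quickly at the start of each session. This is only a sketch using standard conda commands (the file name environment.yml is my own choice), and it speeds up reinstallation rather than making anything persist by itself:

    # Save the resolved environment once...
    !conda env export -n openmm > environment.yml
    # ...and rebuild it from that file in a fresh session
    !conda env create -f environment.yml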

Thank you for your time.

Hi @phisanti! If you use Gradient Notebooks instead of Colab, you only need to install the libraries once and they will persist across sessions :slight_smile:
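
For example, a minimal sketch of how that can look: install Miniconda under a persistent directory instead of /usr/local, so the environment survives restarts. The /storage path below is just an illustration, so check the Gradient docs for the exact persistent mount point on your notebook:

    # Install under a persistent mount (the path shown here is an assumption)
    !bash Miniconda3-latest-Linux-x86_64.sh -bfp /storage/miniconda
    import sys
    # Point the kernel at the persistent environment's packages
    sys.path.append('/storage/miniconda/envs/openmm/lib/python3.6/site-packages')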