Tutorial: How to use CUDA with VS Code

So, since it took me a good while to figure out how to do this, I thought it would be worth writing it up as a tutorial. That way, at least others can benefit from my long hours of struggling with Docker.
The basic idea of this tutorial is to run a Gradient Notebook with the VS Code server (code-server) IDE as its template while still taking advantage of the incredible GPU power available here.

TL;DR
The short version is simple: if you just want to run Ubuntu 20 + CUDA in VS Code server, then when starting your new notebook, skip the “pre-defined” image part, choose your machine, and click on “advanced options”. There, in the “container name” field write phisanti/gradient-coder:latest and in the “entry point” field write /run.sh.
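Once the notebook has started, it is worth confirming from the code-server terminal that the GPU is actually visible. These are standard commands; `nvcc` will only be there if the image ships the CUDA toolkit (the CUDA 10 image above should, as far as I can tell):

```bash
# Run these in the code-server integrated terminal.
nvidia-smi        # should list the notebook's GPU and driver version
nvcc --version    # should report the CUDA toolkit baked into the image
```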

NOTE: in order to use the Python extension, it is necessary to install version v2020.10.332292344 and uninstall the Jupyter extension. More on this bug here: Python extension is broken · Issue #2341 · cdr/code-server (github.com; sorry, I cannot put more than 2 links per post!).
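For reference, a rough sketch of that extension swap from the terminal is below. The extension IDs and the `@version` pin are assumptions on my part; if your code-server build does not accept a version pin, download the matching .vsix for that release and pass its file path instead (or do it all from the Extensions panel):

```bash
# Remove the Jupyter extension and pin the older Python extension release.
# Extension IDs and @version syntax assumed; adjust for your code-server version.
code-server --uninstall-extension ms-toolsai.jupyter
code-server --install-extension ms-python.python@2020.10.332292344
```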

Full version
If, however, you prefer a more customised version, you can get the Dockerfile from phisanti/gradient-code-server: Dockerfile to create the image for a gradient notebook running on code-server with CUDA 10 (github.com). There, you can modify the instructions used to build the code-server image. Once you have built and pushed your code-server image, you can call it from Gradient by forking and pushing this Dockerfile: Paperspace/gradient-coder: Run VSCode as a Gradient Notebook! (https://github.com/Paperspace/gradient-coder). The only change you need to make there is to modify the FROM line to point to your newly built image!
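In practice, the whole fork-modify-push loop can look something like the sketch below. The image names, tags, and user names are placeholders (use your own Docker Hub and GitHub accounts), and it assumes each Dockerfile sits at the root of its repo:

```bash
# 1. Build and push your customised code-server + CUDA image
#    (image names are placeholders).
git clone https://github.com/phisanti/gradient-code-server.git
cd gradient-code-server
docker build -t <your-dockerhub-user>/code-server-cuda:latest .
docker push <your-dockerhub-user>/code-server-cuda:latest

# 2. Fork Paperspace/gradient-coder, point its FROM line at your image,
#    then build and push the notebook image that Gradient will pull.
cd ..
git clone https://github.com/<your-github-user>/gradient-coder.git
cd gradient-coder
sed -i 's|^FROM .*|FROM <your-dockerhub-user>/code-server-cuda:latest|' Dockerfile
docker build -t <your-dockerhub-user>/gradient-coder:latest .
docker push <your-dockerhub-user>/gradient-coder:latest
```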

Once you have pushed the notebook image, start a new notebook, put your container name in the “container” section of the advanced options, write /run.sh as the entry point, and voilà!
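If you prefer the command line over the web UI, the same notebook can, as far as I know, also be created with the Gradient CLI. The flag names below are from memory and may differ between CLI versions, so double-check them against `gradient notebooks create --help`:

```bash
# Create the notebook from your pushed image (flags assumed; verify with --help).
gradient notebooks create \
  --name "code-server-cuda" \
  --projectId <your-project-id> \
  --machineType P5000 \
  --container <your-dockerhub-user>/gradient-coder:latest \
  --command /run.sh
```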

I hope this is of some use to someone. For me, a biologist whose only programming experience is in R and Python, it took a good while, so I hope it saves you some time!

Cheers,
S