Error when creating custom container notebook - Jupyter token not found


#1

I’m trying to create a notebook from a custom container. I’m using PyTorch, but I prefer JupyterLab and need some additional libraries for training.

I built a container and am hosting it publicly on Docker Hub. When creating the notebook I receive an error saying that a Jupyter token wasn’t found. I tried using a startup command in the Dockerfile, and also leaving it out in favor of the option Paperspace provides in the notebook creator.

I’ve tested the Jupyter install in this container and I know it works as expected and prints a token on startup. I’m not sure how Paperspace grabs the token, though.
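For what it’s worth, the usual way a platform discovers the token is by scraping Jupyter’s startup log, which prints a login URL ending in ?token=<hex>. I don’t know that Paperspace does exactly this, but a minimal sketch of that kind of check against a locally running container looks like the following (my-notebook is a hypothetical container name used for illustration):

# Pull the access token out of Jupyter's startup output.
# "my-notebook" is a placeholder container name, not a Paperspace value.
docker logs my-notebook 2>&1 | grep -oE 'token=[0-9a-f]+' | head -n 1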

I’ve pasted the Dockerfile below.

FROM pytorch/pytorch:1.1.0-cuda10.0-cudnn7.5-devel

WORKDIR /workspace/

# install basics
RUN apt-get update && apt-get install -y git curl ca-certificates bzip2 cmake tree htop bmon iotop sox libsox-dev libsox-fmt-all vim

# install python deps
RUN pip install cython visdom cffi tensorboardX wget

# install warp-CTC
ENV CUDA_HOME=/usr/local/cuda
RUN git clone https://github.com/SeanNaren/warp-ctc.git
RUN cd warp-ctc && mkdir build && cd build && cmake .. && make
RUN cd warp-ctc/pytorch_binding && python setup.py install

# install pytorch audio
RUN git clone https://github.com/pytorch/audio.git
RUN cd audio && python setup.py install

# install ctcdecode
RUN git clone --recursive https://github.com/parlance/ctcdecode.git
RUN cd ctcdecode && pip install .

ENV LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH

# install apex
RUN git clone --recursive https://github.com/NVIDIA/apex.git
RUN cd apex && pip install .

# install deepspeech.pytorch
ADD . /workspace/deepspeech.pytorch
RUN cd deepspeech.pytorch && pip install -r requirements.txt

# launch jupyter
RUN pip install jupyterlab
RUN mkdir data notebooks
CMD jupyter-lab --ip="*" --no-browser --allow-root
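One workaround I’m considering is taking token discovery out of the equation by pinning the token in the launch command. A minimal sketch, assuming a notebook-server-based JupyterLab of this vintage (which reads NotebookApp.token) and a made-up placeholder value:

# Bind to all interfaces and pin a known token instead of a random one.
# "mytoken123" is a placeholder for illustration; use a real secret in practice.
CMD jupyter-lab --ip=0.0.0.0 --no-browser --allow-root --NotebookApp.token="mytoken123"

With a fixed token the server is reachable at http://<host>:8888/?token=mytoken123 regardless of what gets parsed from the logs.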

#2

@robertritz Thank you for writing this in. This was indeed a bug, which has since been fixed. Please reach out to [email protected] if you still run into any issues.