TensorFlow on Docker: How do I save work in a Jupyter notebook?

I'm a beginner with both Docker and TensorFlow and am trying them out. Installation (on Windows 10 using the Hyper-V driver) went fine and I can run

docker run -p 8888:8888 -it gcr.io/tensorflow/tensorflow 

and get the output as follows:

 [I 23:01:01.188 NotebookApp] Serving notebooks from local directory: /notebooks
 [I 23:01:01.189 NotebookApp] 0 active kernels
 [I 23:01:01.189 NotebookApp] The Jupyter Notebook is running at: http://[all ip addresses on your system]:8888/
 [I 23:01:01.189 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

and I can open the Jupyter notebook interface in a browser at [docker host address]:8888.

However, after doing some work (for example, creating a new notebook), when I stop the server with Ctrl-C (pressed twice), all the new work is lost. Maybe I'm missing something basic, so here is what I'm unsure about:

  • Should I not be stopping the server this way?
  • I use the same docker run command when restarting. Is that right?

Thank you for your help.

+5
3 answers

You want to run the container as a daemon. Then you can docker stop and docker start the container and keep your work.

docker run -td -p 8888:8888 gcr.io/tensorflow/tensorflow

Running with -it makes the container interactive and keeps it in the foreground, so the work is lost when it is killed. Best practice is to run it as a daemon so that you don't have to Ctrl-C to exit and can let Docker manage the container's state instead.
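For example, a minimal sketch of stopping and resuming after starting the container detached as above (the container ID below is a placeholder; use whatever docker ps reports):

 # find the running container's ID or name
 $ docker ps
 # stop the container; its filesystem, including your notebooks, is kept
 $ docker stop <container-id>
 # later, bring the same container back with your work intact
 $ docker start <container-id>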

+6

I run the image as a named container:

 $ docker run -p 8888:8888 -d --name appu b.gcr.io/tensorflow-udacity/assignments 

appu is the name I gave to my container. -p publishes port 8888 from the Linux container to the Windows host. -d runs the program in the background, so you get the $ prompt back on your console and can continue with other tasks (this is what is called "daemonizing", but don't let the jargon put you off; it just means "please run quietly in the background and give me my console back!"). If you want to stop the container, refer to it by name:

 $ docker stop appu 

The next time you want the same container back, with all the files created in the previous session, start the appu container again:

 $ docker start appu 
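If you are not sure whether the container is still around, docker ps -a (a standard Docker command) also lists stopped containers, so you can check that appu exists before starting it:

 $ docker ps -a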
+3

You can mount the current host folder in place of the default /notebooks folder in the container, so notebooks are written to the host and survive the container. Here is an example:

 $ docker run -p 8888:8888 -v `pwd`:/notebooks -it gcr.io/tensorflow/tensorflow
 [I 02:34:49.393 NotebookApp] Writing notebook server cookie secret to /root/.local/share/jupyter/runtime/notebook_cookie_secret
 [W 02:34:49.411 NotebookApp] WARNING: The notebook server is listening on all IP addresses and not using encryption. This is not recommended.
 [I 02:34:49.420 NotebookApp] Serving notebooks from local directory: /notebooks
 [I 02:34:49.421 NotebookApp] 0 active kernels
 [I 02:34:49.421 NotebookApp] The Jupyter Notebook is running at: http://[all ip addresses on your system]:8888/?token=b9da5de7f61d6a968dc07e55c6157606a4f2f378cd764a91
 [I 02:34:49.421 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
 [C 02:34:49.422 NotebookApp] Copy/paste this URL into your browser when you connect for the first time, to login with a token: http://localhost:8888/?token=b9da5de7f61d6a968dc07e55c6157606a4f2f378cd764a91
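Since the question is on Windows 10, note that `pwd` in backticks is a Unix-shell construct. A rough equivalent in PowerShell (a sketch, assuming the folder is shared with the Docker VM so the mount works) would use the $PWD automatic variable:

 PS> docker run -p 8888:8888 -v ${PWD}:/notebooks -it gcr.io/tensorflow/tensorflow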
+1

Source: https://habr.com/ru/post/1247562/

