Run a complex sequence of commands in a single Docker container

I am trying to automate the following cycle using Docker: create a container, do some work inside it (more than one separate command), and get some data back out of the container.

Something along the lines of:

 for ( i = 0; i < 10; i++ )
     spawn a container
     wget revision-i
     do something with it and store the results in results.txt

According to the documentation, I have to go with:

 for ( ... )
     docker run <image> <long; list; of; instructions; separated; by; semicolons>

Unfortunately, this approach is not attractive and does not scale, since the list of instructions keeps growing in complexity.
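To be concrete, driven from a host-side script (Python here, purely for illustration), that boils down to something like the sketch below; the image name, the URL and the analyze step are placeholders, not parts of my actual setup:

 import subprocess

 for i in range(10):
     # One fresh container per iteration; everything it has to do is crammed
     # into a single shell string handed to /bin/sh -c.
     cmds = ('wget http://example.com/revision-%d -O /tmp/rev; '
             'analyze /tmp/rev >> /home/results.txt') % i
     subprocess.check_call(['docker', 'run', 'your/image', '/bin/sh', '-c', cmds])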

Wrapping the instructions in a script, as in docker run <image> /bin/bash script.sh , does not work either, since I want to create a new container for each iteration of the loop.

Summarizing:

  • Is there any reasonable way to run a complex sequence of commands, as described above, in a single container?

  • As soon as some data is stored inside the container, for example in /home/results.txt, and the container exits, how can I get results.txt out? The only way I can think of is to commit the container to a new image and extract the file from it via a tar export. Is there a more efficient way to do this?

Bonus: should I use vanilla LXC instead? I have no experience with it, so I'm not sure. Thanks.

+4
3 answers

In the end, I came up with a solution that works for me and greatly improved my Docker experience.

In short: I used a combination of Fabric and a container running sshd.

Details:

The idea is to start container(s) running sshd via Fabric's local, and then run commands inside the containers using Fabric's run.

To give an example (Python), you could have a Container class with:

1) a method to locally start a new container running sshd, e.g.

 local('docker run -d -p 22 your/image /usr/sbin/sshd -D') 

2) a method that sets the env parameters Fabric needs to connect to the running container (check the Fabric documentation to learn more about this)

3) methods that run whatever you want inside the container using Fabric's run, e.g.

 run('uname -on') 
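Putting the three pieces together, a minimal sketch of such a Container class, assuming a Fabric 1.x-style API (fabric.api.local / run / env); the image name, the root password and the uname helper are placeholders rather than anything prescribed by Docker or Fabric:

 from fabric.api import env, local, run

 class Container(object):
     def __init__(self, image='your/image'):
         self.image = image
         self.cid = None

     def start(self):
         # 1) locally start a new container running sshd and remember its id
         self.cid = local('docker run -d -p 22 %s /usr/sbin/sshd -D' % self.image,
                          capture=True).strip()

     def connect(self):
         # 2) point Fabric at the host port mapped to the container's port 22
         mapping = local('docker port %s 22' % self.cid, capture=True).strip()
         host_port = mapping.split(':')[-1]  # handles both "49154" and "0.0.0.0:49154"
         env.host_string = 'root@127.0.0.1:%s' % host_port
         env.password = 'screencast'  # whatever password the image's sshd is set up with

     def uname(self):
         # 3) run arbitrary commands inside the container via Fabric's run
         return run('uname -on')

With that in place, each iteration of the loop can simply do c = Container(); c.start(); c.connect() and then call run-based methods like c.uname() against a fresh container.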

Oh, and if you like Ruby, you can do the same with Capistrano.

Thanks @qkrijger (+1'd) for putting me on the right path :)

+3

Regarding question 2:

I do not know if this is the best way, but you could set up SSH on the image and use it. For more information on this, you can check this page from the documentation.
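For what it is worth, once sshd is reachable, pulling something like /home/results.txt out of the container is a one-liner over scp. A sketch (the published host port 49154 and key-based auth for root are assumptions; adjust to however sshd was set up on the image):

 import subprocess

 # Copy the result file out of the running container over its published SSH port.
 subprocess.check_call([
     'scp', '-P', '49154',
     'root@127.0.0.1:/home/results.txt',
     './results.txt',
 ])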

+1

You are asking 2 questions in one. Perhaps you should post 2. as a separate question. I will address 1. here.

It is not clear to me whether you want to create a new container for each iteration (as you say at first), or whether you want to "run a complex series of commands, as described above, in a single container" (as you say later).

If you want to create multiple containers, I expect you will have a script on your host machine anyway. If you need to pass an argument into your container (e.g. i ): work is currently underway on passing arguments. See https://github.com/dotcloud/docker/pull/1015 (and https://github.com/dotcloud/docker/pull/1015/files for the documentation change, which is not online yet).
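Until that lands, one thing that already works is to pass the value as an extra argument after the image name, to be picked up by a script baked into the image. A sketch of such a host-side driver script (the image name and /usr/local/bin/process.sh are hypothetical):

 import subprocess

 for i in range(10):
     # Everything after the image name becomes the command run inside the
     # container, so the script receives the revision number as $1.
     subprocess.check_call(
         ['docker', 'run', 'your/image', '/usr/local/bin/process.sh', str(i)])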

0
