Managing many Python / virtualenvs projects

At my workplace, I have to manage many (currently dozens, possibly hundreds) Python web applications that use various frameworks and libraries, all in different versions. Virtualenv has been a lifesaver for managing this so far, but I would like to manage it better, especially when it comes to package updates.

I have thought of several scenarios:

Option 1: Install all the necessary modules for each project in its own virtualenv using pip, and update them individually each time. This requires significant time for each update and additional documentation to keep track of everything, though it could be eased with some control scripts.
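A control script for option 1 could be as small as the sketch below. It assumes a hypothetical layout where every project lives under `/srv/apps/<name>`, with its virtualenv in `<name>/venv` and its pinned dependencies in `<name>/requirements.txt`; adjust the paths to your setup.

```python
import os
import subprocess

def upgrade_commands(root):
    """Build a pip upgrade command for every project under root
    that has both a virtualenv and a requirements file."""
    cmds = []
    for name in sorted(os.listdir(root)):
        project = os.path.join(root, name)
        pip = os.path.join(project, "venv", "bin", "pip")
        req = os.path.join(project, "requirements.txt")
        if os.path.exists(pip) and os.path.exists(req):
            cmds.append([pip, "install", "--upgrade", "-r", req])
    return cmds

def main(root="/srv/apps"):
    # Run each project's own pip, so packages land in the right venv.
    for cmd in upgrade_commands(root):
        print("Updating:", " ".join(cmd))
        subprocess.check_call(cmd)

if __name__ == "__main__" and os.path.isdir("/srv/apps"):
    main()
```

Because each project is updated through its own virtualenv's pip, you can also run the loop for a single project when you only want to roll one application forward.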

Option 2: Install every library used by any application into a central repository, and use symbolic links to switch versions once for all projects. This gives easy updates and centralized management, but largely defeats the purpose of using virtualenv.

Option 3: A hybrid of the two above: centralize the most common libraries and/or those most likely to need updates, and install the rest locally in each virtualenv.

Does anyone have a similar situation? What is the best way to handle this?

2 answers

You can use zc.buildout. It is more annoying than plain pip/virtualenv, but it gives you more automation options. If disk space is not a problem, I would suggest simply using a separate environment for each project so that you can update them one at a time.
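For reference, a minimal `buildout.cfg` per project might look like the sketch below, using the standard `zc.recipe.egg` recipe; the egg names are examples, not taken from the question.

```ini
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs =
    Django
interpreter = py
```

Running `bin/buildout` then builds an isolated, reproducible environment from this file, which is where the extra automation over bare pip/virtualenv comes from.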


We keep a requirements.pip file in our project root that lists the packages to install, so updating can be automated fairly simply. I'm not sure symlinking would solve the problem; it would make it harder to update only a subset of your projects. If disk space is not a problem and you can write some simple scripts to list and update packages, I would stick with virtualenv as-is.
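A requirements file like this answer describes pins each dependency to an exact version; the package names and versions below are illustrative, not from the answer:

```text
# requirements.pip -- pinned versions, one per line
Django==1.3.1
South==0.7.3
```

Each project's environment is then updated with its own pip, e.g. `path/to/venv/bin/pip install --upgrade -r requirements.pip`, so bumping a version in the file and re-running the command is the whole update procedure.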


Source: https://habr.com/ru/post/889968/

