At my workplace, I have to manage many Python web applications (currently dozens, possibly hundreds) that use various frameworks, libraries, etc., all in different versions. Virtualenv has been a lifesaver for managing this so far, but I would like to manage it better, especially when it comes to package updates.
I have thought of several scenarios:
Option 1: Install all the necessary modules for each project into its own virtualenv with pip, and update them individually each time. This adds significant time overhead for every update and requires extra documentation to keep track of things, though it could be eased with some control scripts.
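A minimal sketch of such a control script, assuming a hypothetical layout where each virtualenv directory contains a `requirements.txt` alongside its `bin/` directory (the function names and paths are illustrative, not any standard tool):

```python
import subprocess
from pathlib import Path

def pip_upgrade_cmd(venv: Path) -> list[str]:
    """Build the upgrade command for one virtualenv, assuming a
    requirements.txt lives inside the venv directory (hypothetical layout)."""
    return [str(venv / "bin" / "pip"), "install", "--upgrade",
            "-r", str(venv / "requirements.txt")]

def update_all(root: Path, dry_run: bool = True) -> list[list[str]]:
    """Walk every directory under `root` that looks like a virtualenv
    (has bin/pip) and upgrade its packages.  With dry_run=True, only
    return the commands that would be run, without executing them."""
    cmds = [pip_upgrade_cmd(v) for v in sorted(root.iterdir())
            if (v / "bin" / "pip").exists()]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=True)
    return cmds
```

Running it with `dry_run=True` first gives a reviewable list of what would be upgraded, which helps with the documentation/tracking problem.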
Option 2: Install every library used by any application into a central repository, and use symbolic links to switch versions once for all projects. Updates are easy and management is centralized, but this largely defeats the purpose of using virtualenv in the first place.
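The symlink-switching idea could look something like this sketch, assuming a hypothetical central directory holding one subdirectory per installed version (`<package>-<version>`); the function name and layout are my own illustration:

```python
from pathlib import Path

def switch_version(central: Path, package: str, version: str,
                   link_dir: Path) -> Path:
    """Repoint link_dir/<package> at central/<package>-<version>.
    The directory layout here is illustrative, not a standard."""
    target = central / f"{package}-{version}"
    if not target.exists():
        raise FileNotFoundError(target)
    link = link_dir / package
    # Replace any existing link so the switch is a single atomic-ish step.
    if link.is_symlink() or link.exists():
        link.unlink()
    link.symlink_to(target)
    return link.resolve()
```

The catch the option itself notes: if `link_dir` is shared by all projects, one `switch_version` call changes the library for every application at once, so you lose per-project isolation.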
Option 3: A hybrid of the two above: centralize the most common libraries and/or those most likely to need updates, and install the rest locally in each virtualenv.
Has anyone dealt with a similar situation? What is the best way to handle it?