Organization of Python projects with shared packages

What is the best way to organize and develop a project consisting of many small scripts sharing one (or more) Python libraries?

Our repository contains many programs that all use the same libraries, which live alongside them in the same repository. In other words, a layout like:

trunk
    libs
        python
        utilities
    projects
        projA
        projB

When the official runs of our programs are performed, we want to record which version of the code was used. This is easy for our C++ executables: as long as the working copy is clean at compile time, we are fine. (And since we get the version number programmatically, it has to be a working copy, not an export.) For the Python scripts, things are more complicated.
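For the record-keeping side, here is a minimal sketch of how the revision check might look from Python, assuming the standard svnversion tool is on the PATH; the clean-working-copy check is my addition for illustration, not something from the question.

    import subprocess

    def working_copy_revision(path="."):
        """Ask `svnversion` for the working-copy revision string.

        A trailing 'M' means local modifications, a range such as
        '4123:4168' means a mixed-revision working copy, and 'exported'
        means the path is not a working copy at all.
        """
        return subprocess.run(
            ["svnversion", path],
            capture_output=True, text=True, check=True,
        ).stdout.strip()

    if __name__ == "__main__":
        rev = working_copy_revision()
        # A clean, single-revision working copy prints a bare number;
        # anything else (ranges, 'M', 'S', 'exported') is suspect.
        if not rev.isdigit():
            raise SystemExit(f"refusing official run: working copy not clean ({rev})")
        print(f"recording revision {rev} for this run")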

The problem is that projA is often mid-run when projB needs to be updated, and updating the working copy then leaves the running projA using a mix of revisions. (A single run takes several hours, and its output feeds processes that take days to complete, which is why tracking the exact version matters.)

My current workaround, when this comes up, is to check out a second copy of the trunk somewhere else and run from there. But then I have to remember to point PYTHONPATH at the second tree's libs/python rather than the one in the first tree.
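To make that less error-prone, one option is a small launcher that sets PYTHONPATH explicitly before starting the script. The paths and the run.py entry point below are hypothetical stand-ins for the layout above, not names from the question.

    import os
    import subprocess
    import sys

    # Hypothetical second checkout of trunk, kept pristine for official runs.
    official_trunk = os.path.expanduser("~/checkouts/trunk-official")

    env = dict(os.environ)
    # Put the frozen libs/python first so it shadows any development
    # checkout that may already be on PYTHONPATH.
    env["PYTHONPATH"] = os.pathsep.join(
        p for p in (os.path.join(official_trunk, "libs", "python"),
                    env.get("PYTHONPATH", ""))
        if p
    )

    # Launch projA's entry point from the official tree.
    subprocess.run(
        [sys.executable,
         os.path.join(official_trunk, "projects", "projA", "run.py")],
        env=env, check=True,
    )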

There is unlikely to be a perfect answer. But there must be a better way.

We use Subversion. Should we be looking at virtualenv? Setuptools? Something else?

+3

2 answers

Have a look at Subversion externals: each project can pull the shared libraries into its own tree at a pinned revision, so it is not disturbed when the libraries are updated for another project.

Edit: virtualenv is also worth a look here, since it gives each project its own isolated environment.
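One way to combine that with the Setuptools idea from the question is to give the shared library a setup.py, so that a specific release can be installed into each project's virtualenv instead of being imported from the live working copy. This is only a sketch; the package name and version number are placeholders.

    # libs/python/setup.py -- minimal sketch; name and version are placeholders.
    from setuptools import setup, find_packages

    setup(
        name="shared-libs",
        version="1.4.2",
        packages=find_packages(),
    )

Each project's environment can then depend on shared-libs==1.4.2 (or whatever release was tagged for the run), so updating the trunk for projB no longer changes what a running projA imports.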

+2

Look at virtualenv: it gives each project its own isolated environment with its own copy of the libraries.
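For the traceability requirement in the question, one way to use virtualenv is to create a fresh environment per official run and record its contents next to the run's outputs. Everything below (directory name, package name, POSIX bin/ layout) is a hypothetical sketch and assumes the virtualenv package is installed.

    import subprocess
    import sys

    run_env = "projA-run-001"  # hypothetical per-run environment directory

    # Create a fresh, isolated environment for this official run.
    subprocess.run([sys.executable, "-m", "virtualenv", run_env], check=True)

    pip = f"{run_env}/bin/pip"
    # Install the pinned release of the shared library into it.
    subprocess.run([pip, "install", "shared-libs==1.4.2"], check=True)

    # Record exactly what this run used, for later reference.
    with open(f"{run_env}-manifest.txt", "w") as manifest:
        subprocess.run([pip, "freeze"], stdout=manifest, check=True)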

+1
