I am experimenting with keeping my Python code in the standard directory structure used for deployment with setup.py and possibly PyPI. For a Python library called mylib, it would look something like this:
```
mylibsrc/
    README.rst
    setup.py
    bin/
        some_script.py
    mylib/
        __init__.py
        foo.py
```
Often there is also a test/ subdirectory, but I have not tried writing unit tests yet. The recommendation to put scripts in a bin/ subdirectory can be found in the official Python packaging documentation.
Naturally, the scripts start with code that looks like this:

```python
#!/usr/bin/env python
from mylib.foo import something

something("bar")
```
This works fine once the package has been deployed (e.g. with devpi) and installed with pip. But if I run the script directly from the source directory, as I would while developing new changes to the library/script, I get this error:
```
ImportError: No module named 'mylib'
```
This happens even if the current working directory is the root of mylibsrc/ and I run the script by typing ./bin/some_script.py. It happens because Python starts searching for packages in the directory of the script being run (i.e. bin/), not in the current working directory.
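To see this for yourself, a throwaway script along these lines (a hypothetical bin/show_path.py, not part of the layout above) prints what actually ends up on the search path:

```python
#!/usr/bin/env python
# Hypothetical bin/show_path.py, only for illustrating the search path problem.
import os
import sys

# When a script is run by path, sys.path[0] is the directory containing the
# script (here bin/); the current working directory is not added automatically.
print("sys.path[0]:", sys.path[0])
print("cwd:        ", os.getcwd())
```

Run from the project root as ./bin/show_path.py, the first search-path entry points at bin/, which is why mylib is never found.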
What is a good way to run scripts easily when developing packages?
Here is another relevant question (especially the comments on its first answer).
The solutions I have found so far fall into three categories, but none of them is ideal:

- Manually adjust the Python module search path before running the scripts.
  - Manually add mylibsrc to my PYTHONPATH environment variable. This is apparently the most official (Pythonic?) solution, but it means that every time I check out the project I have to remember to change my environment by hand before I can run any code in it.
  - Add . to the beginning of my PYTHONPATH environment variable. As far as I understand, this can have security implications. It would be my favorite hack if I were the only person using my code, but I am not, and I do not want to ask others to do it.
  - While browsing around for advice on files in the test/ directory, I have seen recommendations that they all (indirectly) include a line like sys.path.insert(0, os.path.abspath('..')) (for example, in Structuring Your Project). Ugh! That seems like a tolerable hack for files that are only used for testing, but not for files that will be installed with the package. (A sketch of what this looks like is shown after this list.)
  - Edit: I have since found an alternative that falls into this category: when running scripts with Python's -m option, the search path starts in the working directory rather than in bin/. See my answer below for more details.
- Install the package into a virtual environment before using it, via setup.py (either running it directly or using pip).
  - This seems like overkill when I am just testing a change that I am not even sure is syntactically correct yet. Some of the projects I am working on are not even meant to be installed as packages, but I want to use the same directory structure for everything, and that would mean writing a setup.py just so I could test them! (A minimal setup.py sketch appears after this list.)
  - Edit: Two interesting variants of this are discussed in the answers below: the setup.py develop command in logc's answer and pip install -e in mine. They avoid the need to "install" after every little edit, but you still have to create a setup.py for packages you never intend to install properly, and they do not work very well with PyCharm (which has a menu entry to run the develop command but no easy way to run the scripts it copies into the virtual environment).
- Move the scripts to the project's root directory (i.e. into mylibsrc/ instead of mylibsrc/bin/).
  - Ugh! This is a last resort, but unfortunately it seems like the only viable option at the moment.
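For reference, here is roughly what the sys.path.insert hack from the first category looks like in a test file. This is only a sketch, assuming a hypothetical test/test_foo.py sitting next to mylib/, not code from my project:

```python
# Hypothetical test/test_foo.py using the sys.path hack described above.
import os
import sys

# Assumes the tests are run from inside test/, so '..' is the project root;
# a more robust variant would build the path from __file__ instead.
sys.path.insert(0, os.path.abspath('..'))

from mylib.foo import something


def test_something():
    # Placeholder check; real tests would exercise mylib properly.
    assert something is not None
```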
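For the second category, a minimal setup.py along these lines is enough for pip install -e . or python setup.py develop to work. The metadata and the scripts entry are placeholders based on the layout above, not a definitive recipe:

```python
# Minimal sketch of a setup.py for the layout shown above (placeholder metadata).
from setuptools import setup, find_packages

setup(
    name='mylib',
    version='0.0.1',  # placeholder version
    packages=find_packages(exclude=['test', 'test.*']),
    scripts=['bin/some_script.py'],  # copied onto PATH when installed
)
```

After pip install -e . inside a virtualenv, mylib resolves straight from the source tree, so edits take effect without reinstalling.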