I have the following structure in a Python program:
    my_program/
        main.py
        packages/
            __init__.py
            package_to_share/
                __init__.py
                main_to_share.py
                module_to_share.py
            package_A/
                __init__.py
                main_A.py
                some_module_A.py
            package_B/
                __init__.py
                main_B.py
                some_module_B.py
The package_to_share package provides functionality that every package in the packages folder uses, and that main.py in the root folder uses as well.
I also want to be able to cd into each package and run its main_X.py directly.
So far, I've figured out how to access functions from main.py:
    import packages.package_A.some_module_A
    import packages.package_to_share.module_to_share
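For instance (a minimal sketch, assuming module_to_share defines a hypothetical function shared_helper()), main.py can then call into the shared module like this:

    # main.py, run from the my_program/ root directory
    import packages.package_to_share.module_to_share as shared

    # shared_helper() is a hypothetical function assumed to exist in module_to_share
    shared.shared_helper()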
but I am having problems accessing functions in package_to_share from the regular packages (e.g. package_A). For example, I cannot write import packages.package_to_share.module_to_share in main_A.py or some_module_A.py.
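To make the failure concrete, here is a minimal sketch of what happens when main_A.py is run from inside its own directory:

    # main_A.py, run as: cd packages/package_A && python main_A.py
    # sys.path starts with package_A/, which contains no "packages" package,
    # so this import fails with ImportError (ModuleNotFoundError in Python 3)
    import packages.package_to_share.module_to_share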
This leads me to the following questions:
1. Given the specifics of my problem, with packages that need to be accessible both from files in the root folder and from other packages, is there a better way to organize my folders?
2. Does this organization of modules and files generally follow good Python practice?
The following looks like a terrible hack to me, but it is the best I've come up with to make sure my regular modules can see the "shared" modules:
    import os
    import sys

    # Directory containing this file
    p_this_file = os.path.dirname(os.path.realpath(__file__))
    # Prepend the parent directory so the shared package resolves
    new_sys_path = [os.path.join(p_this_file, '..')]
    new_sys_path.extend(sys.path)
    sys.path = new_sys_path
It also does not prevent my regular packages from importing each other.
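For reference, a minimal sketch of how the hack sits at the top of main_A.py, assuming the goal is to put the packages/ directory on sys.path so the shared package becomes importable as a top-level package:

    # main_A.py
    import os
    import sys

    # Prepend the parent directory (packages/) to sys.path
    sys.path.insert(0, os.path.join(os.path.dirname(os.path.realpath(__file__)), '..'))

    # With packages/ on sys.path, the shared module resolves without
    # the "packages." prefix:
    import package_to_share.module_to_share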