How to run unit tests with nose for Apache Spark Python applications?
With nose you can usually just run the command

nosetests

to discover and run the tests in the tests directory of a Python package. However, PySpark scripts must be launched with spark-submit rather than the regular Python interpreter so that the pyspark module can be imported. How can I combine nosetests with PySpark to run the tests for my Spark application?
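For context, the kind of workaround I have in mind is making pyspark importable from a plain Python process by putting Spark's Python directories on sys.path before the tests run (e.g. in a test-setup module). A minimal sketch, assuming a standard SPARK_HOME layout with python/ and python/lib/py4j-*.zip; the helper name is mine:

```python
import glob
import os
import sys


def add_pyspark_to_path(spark_home=None):
    """Make pyspark importable without spark-submit.

    Hypothetical helper: assumes the usual Spark distribution layout,
    where the pyspark package lives under SPARK_HOME/python and the
    bundled Py4J is a zip under SPARK_HOME/python/lib.
    """
    spark_home = spark_home or os.environ.get("SPARK_HOME")
    if not spark_home:
        raise RuntimeError("SPARK_HOME is not set")
    # Put SPARK_HOME/python first so `import pyspark` resolves there.
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # pyspark depends on the bundled Py4J zip.
    for zip_path in glob.glob(
        os.path.join(spark_home, "python", "lib", "py4j-*.zip")
    ):
        sys.path.insert(0, zip_path)
    return spark_home
```

Calling something like this before nosetests collects the test modules would avoid spark-submit entirely, but I am unsure whether that is the recommended way to test Spark applications.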