Running unit tests with nose for PySpark

How to run unit tests with nose for Apache Spark Python applications?

With nose, you can usually just run

 nosetests

to pick up the tests in a Python package's tests directory. PySpark scripts, however, must be launched with the spark-submit command instead of the regular Python executable so that the pyspark module can be imported. How can I combine nosetests with spark-submit to run the tests for my Spark application?

1 answer

If it helps, we use nosetests to test sparklingpandas. We did a bit of work in our utils file to add pyspark to the path based on the SPARK_HOME environment variable.
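The helper below is a minimal sketch of that approach, not sparklingpandas' actual code: the function name and the directory layout under SPARK_HOME are assumptions based on how standard Spark distributions ship their Python sources. Call it once (for example, at the top of a test module or in a nose setup hook) before importing pyspark.

```python
import glob
import os
import sys


def add_pyspark_to_path():
    """Prepend PySpark's Python sources to sys.path using SPARK_HOME.

    Hypothetical helper modeled on the approach described above; the
    exact names are assumptions, not sparklingpandas' real utils code.
    """
    spark_home = os.environ.get("SPARK_HOME")
    if not spark_home:
        raise RuntimeError("SPARK_HOME is not set")
    # PySpark's own package lives under $SPARK_HOME/python.
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # Py4J (the Python-JVM bridge PySpark depends on) ships as a zip
    # under python/lib in standard Spark distributions; add it as well.
    for py4j_zip in glob.glob(
        os.path.join(spark_home, "python", "lib", "py4j-*.zip")
    ):
        sys.path.insert(0, py4j_zip)
```

With the path set up this way, plain `nosetests` can collect tests that `import pyspark` and build a local SparkContext, without wrapping the whole test run in spark-submit.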


Source: https://habr.com/ru/post/1205932/
