I have an Akka system written in Scala that needs to call Python code that depends on pandas and NumPy, so I can't just use Jython. I noticed that Spark runs CPython on its worker nodes, so I'm curious how it executes Python code and whether that mechanism exists in some reusable form.
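To make the goal concrete, here is a minimal sketch of the general pattern I'm asking about: launching a CPython subprocess from Scala and exchanging data over stdin/stdout. This is only an illustration under the assumption that `python3` is on the PATH; it is a one-shot call, whereas a real integration (as I understand Spark does) would keep long-running Python workers and a richer protocol.

```scala
import scala.sys.process._

object PySubprocessSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical one-liner standing in for pandas/NumPy logic.
    val script = "import sys; print(int(sys.stdin.read()) * 2)"

    // Pipe "21" into the CPython process and capture its stdout.
    // Assumes python3 is available on the PATH.
    val result = ("echo 21" #| Seq("python3", "-c", script)).!!.trim

    println(result)
  }
}
```

This avoids Jython entirely, since the Python side is a plain CPython interpreter with full access to native extensions, but it raises the questions above about how Spark manages such processes robustly (pooling, serialization, error handling).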
pandas scala interop apache-spark pyspark
Arne Claassen Jun 06 '15 at 16:18