You are very close! However, the way you are approaching this is a bit of an anti-pattern: you don't want to share task data between tasks in Airflow, and you don't want to run the statement through a separate operator the way your mysql_operator_test task does. It is tempting; I did the same thing when I started.
I tried something very similar to this, but with SFTP connections. In the end, I just did everything inside a PythonOperator and used the basic hooks.
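For what it's worth, here is a rough sketch of that SFTP pattern. It assumes the paramiko-based SSHHook from Airflow 1.10's contrib package (where get_conn() returns a paramiko SSHClient); the connection id and the file paths are placeholders, not anything from your setup:

    from airflow.contrib.hooks.ssh_hook import SSHHook  # Airflow 1.10 path

    def fetch_file_over_sftp():
        """Open an SFTP session from an SSHHook and pull a file down."""
        ssh_hook = SSHHook(ssh_conn_id="ssh_default")   # placeholder connection id
        ssh_client = ssh_hook.get_conn()                # paramiko SSHClient
        sftp = ssh_client.open_sftp()
        sftp.get("/remote/path/data.csv", "/tmp/data.csv")  # placeholder paths
        sftp.close()
        ssh_client.close()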
I would recommend using MySqlHook inside your python_callable. Something like this:
    # Airflow 1.x import path; in Airflow 2.x it lives in airflow.providers.mysql.hooks.mysql
    from airflow.hooks.mysql_hook import MySqlHook

    def count_mysql_and_then_use_the_count():
        """
        Runs a COUNT query through the MySqlHook and returns the result.
        """
        mysql_hook = MySqlHook(...)  # pass your mysql_conn_id here
        conn = mysql_hook.get_conn()
        cur = conn.cursor()
        cur.execute("""SELECT count(*) FROM table_1 WHERE id > 100""")
        for count in cur:
            return count[0]
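For context, here is a minimal sketch of how that callable could be wired into a DAG with a PythonOperator. The dag_id, start_date, schedule and task_id are placeholders, and the PythonOperator import path shown is the Airflow 1.x one (it moved in 2.x):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator  # Airflow 1.x path

    # Placeholder DAG; adjust the dag_id, start_date and schedule for your setup.
    dag = DAG(
        dag_id="mysql_count_example",
        start_date=datetime(2018, 1, 1),
        schedule_interval="@daily",
    )

    # The hook runs entirely inside the callable, so no data is passed between tasks.
    count_task = PythonOperator(
        task_id="count_mysql_and_then_use_the_count",
        python_callable=count_mysql_and_then_use_the_count,
        dag=dag,
    )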
I'm not sure this exact code will work, but the idea is that you use a hook inside your python_callable. I haven't used MySqlHook much myself, but I did this with SSHHook and it works great.