I have a SparkSQL connection to an external database:
from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .appName("Python Spark SQL basic example") \
    .getOrCreate()
If I know the name of the table, it is easy to query.
users_df = spark \
    .read.format("jdbc") \
    .options(dbtable="users", **db_config) \
    .load()
But is there a good way to list or inspect the available tables?
I want the equivalent of SHOW TABLES in MySQL or \dt in Postgres.
I'm using PySpark v2.1, if that matters.
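The closest workaround I've found is to push the database's own catalog query through the same JDBC reader, since dbtable accepts a parenthesized subquery. This is just a sketch: it assumes the external database exposes a Postgres-style information_schema, and table_list is an alias I made up.

```python
# Query the external database's catalog instead of a data table.
# Assumes a Postgres-style information_schema; `table_list` is an
# arbitrary alias required by the JDBC subquery syntax.
tables_query = (
    "(SELECT table_name FROM information_schema.tables"
    " WHERE table_schema = 'public') AS table_list"
)

# With the `spark` session and `db_config` from above:
# tables_df = spark \
#     .read.format("jdbc") \
#     .options(dbtable=tables_query, **db_config) \
#     .load()
# tables_df.show()
```

But this is database-specific (MySQL, for example, filters on a different schema name), so I'm hoping there is a more portable, Spark-native way.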