I defined my Delta table in an external location as follows:
```sql
%sql
CREATE OR REFRESH STREAMING TABLE pumpdata (
  Body STRING,
  EnqueuedTimeUtc STRING,
  SystemProperties STRING,
  _rescued_data STRING,
  Properties STRING
)
USING DELTA
LOCATION 'abfss://[email protected]/Bronze/pumpdata'
```
I have a Delta Live Tables pipeline with these settings:

As you can see, I have set the same external location as the pipeline's storage location and selected the Hive metastore as the storage option.
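Since the screenshot doesn't carry over here, the pipeline settings JSON looks roughly like this (a minimal sketch: the pipeline name and notebook path are placeholders I've made up, the `storage` value is the external location from above, and there is no `catalog` key because I picked the Hive metastore rather than Unity Catalog):

```json
{
  "name": "pumpdata_pipeline",
  "storage": "abfss://[email protected]/Bronze/pumpdata",
  "libraries": [
    { "notebook": { "path": "/Workspace/Users/me/pumpdata_dlt" } }
  ],
  "continuous": false
}
```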
The table is defined in the pipeline notebook as follows:
```python
import dlt
from pyspark.sql.functions import col

# All JSON files under the nested folders of the container
json_path = "abfss://[email protected]/XXXX/*/*/*/*/*.JSON"

@dlt.create_table(
    name="pumpdata",
    table_properties={"quality": "raw"},
    comment="Data ingested from an ADLS2 storage account.",
)
def pumpdata():
    # Incrementally ingest the JSON files with Auto Loader
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "JSON")
        .load(json_path)
    )
```
I can run my DLT pipeline successfully, and Parquet files are written to the storage account, but I cannot see my table in the catalog under the Hive metastore.
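To double-check, this is the kind of lookup I run from a regular (non-DLT) notebook, a minimal sketch assuming the table would be published to the `default` schema:

```python
# List everything registered in the metastore's default schema;
# pumpdata does not show up here.
for t in spark.catalog.listTables("default"):
    print(t.name, t.tableType)

# The files themselves are reachable at the external location, although this
# read only succeeds if a _delta_log was actually written there:
df = spark.read.format("delta").load(
    "abfss://[email protected]/Bronze/pumpdata"
)
df.printSchema()
```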
Here is my catalog:

Where can I find my Delta Live Table?