
databricks - Missing Delta Live Table in hive_metastore catalog - Stack Overflow


I defined my Delta table in an external location as follows:

%sql
CREATE OR REFRESH STREAMING TABLE pumpdata (
  Body STRING,
  EnqueuedTimeUtc STRING,
  SystemProperties STRING,
  _rescued_data STRING,
  Properties STRING
)
USING DELTA
LOCATION 'abfss://[email protected]/Bronze/pumpdata'
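
If that statement succeeded, the table should be visible through the metastore. A minimal check from a Python notebook cell; `hive_metastore.default` is an assumption about which schema the table landed in, and `spark` is the ambient Databricks session:

# List tables in the schema where pumpdata is expected to be registered.
# "hive_metastore.default" is an assumption; adjust to your schema.
spark.sql("SHOW TABLES IN hive_metastore.default").show(truncate=False)

# If the table is registered, this prints its metadata, including Location.
spark.sql("DESCRIBE TABLE EXTENDED hive_metastore.default.pumpdata").show(100, truncate=False)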

I have a Delta Live Tables pipeline with these settings (screenshot of the pipeline settings omitted):

As you can see, I have defined the same external location and set Hive metastore as the storage option,

and this definition:

import dlt

# Source path; "XXXX" is redacted in the original question
json_path = "abfss://[email protected]/XXXX/*/*/*/*/*.JSON"

@dlt.create_table(
    name="pumpdata",
    table_properties={"quality": "raw"},
    comment="Data ingested from an ADLS2 storage account.",
)
def pumpdata():
    # Incrementally ingest the JSON files with Auto Loader (cloudFiles)
    return (
        spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "JSON")
            .load(json_path)
    )
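
For comparison, here is a hedged sketch (not the asker's code) of the same definition using the decorator's optional `path` argument, which the DLT Python API accepts for hive-metastore pipelines to pin the table's storage location; `json_path` is reused from the cell above:

import dlt

# Sketch only: "path" pins the table's storage location explicitly.
# "spark" is the ambient Databricks session. Note that whether the table
# is also published to the hive_metastore catalog depends on the
# pipeline's target schema setting, not on this argument.
@dlt.create_table(
    name="pumpdata",
    comment="Data ingested from an ADLS2 storage account.",
    table_properties={"quality": "raw"},
    path="abfss://[email protected]/Bronze/pumpdata",
)
def pumpdata():
    return (
        spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "JSON")
            .load(json_path)
    )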

I can successfully run my DLT pipeline, and Parquet files are written to the storage account, but I cannot see my table in the catalog under hive_metastore.
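
Independent of the catalog question, the written data can be verified by reading the Delta files directly by path; a minimal sketch, assuming the files landed under the Bronze path used above (the actual layout depends on the pipeline's storage setting):

# Read the Delta output by storage path, bypassing the metastore.
# The exact path is an assumption; DLT may nest tables under its storage root.
df = spark.read.format("delta").load(
    "abfss://[email protected]/Bronze/pumpdata"
)
df.printSchema()
print(df.count())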

Here is my catalog (screenshot of the catalog tree omitted):

Where can I find my Delta Live Table?
