When attempting to create a table in Databricks:
CREATE OR REPLACE TABLE foo.bar (
comment VARCHAR(255),
row_count INT,
date TIMESTAMP
)
the statement returns the following error:
[DELTA_CREATE_TABLE_WITH_NON_EMPTY_LOCATION] Cannot create table ('spark_catalog.foo.bar'). The associated location ('dbfs:/user/hive/warehouse/foo/bar') is not empty and also not a Delta table. SQLSTATE: 42601
I expected it to create (or replace) the table, since that is what CREATE OR REPLACE should do based on the advice I've read.
- This is expected if the mentioned location previously contained a non-Delta external table. Most likely, the data is in a different format (e.g., Parquet, ORC, or CSV). You can drop the existing data at the location and then recreate the table as a Delta table – BruceWayne Commented Feb 13 at 17:24
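As a quick way to confirm what is actually sitting at that location, here is a minimal sketch, assuming the default Hive warehouse path taken from the error message and a Databricks notebook where dbutils is available:

# Hypothetical path copied from the error message; adjust to your own location
path = "dbfs:/user/hive/warehouse/foo/bar"

# List whatever files were left behind. Parquet/ORC/CSV files with no
# _delta_log directory would explain the non-Delta-table error.
for f in dbutils.fs.ls(path):
    print(f.path, f.size)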
1 Answer
I had a similar problem in the past, and it turned out the table had been deleted from my schema but some metadata was left behind in DBFS. So the new table is trying to use that location in DBFS, but sees the location is already taken.
I recommend you have a look in your DBFS, and if this is the case, try the dbutils.fs.rm(...) command to empty the location. After that, you can try again and hopefully it will work.
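A minimal sketch of that cleanup, assuming the path from the error message. Note that recurse=True deletes everything under the path, so verify its contents first:

# Hypothetical path copied from the error message - verify before deleting
path = "dbfs:/user/hive/warehouse/foo/bar"

# Remove the leftover files recursively, then recreate the table as Delta
dbutils.fs.rm(path, recurse=True)

spark.sql("""
    CREATE OR REPLACE TABLE foo.bar (
        comment VARCHAR(255),
        row_count INT,
        date TIMESTAMP
    )
""")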