python - Pandas DataFrame.to_sql randomly and silently fails without error message - Stack Overflow

I am trying to write several pandas dataframes (all of the same format) to a Postgres db, however some of them are randomly not written to the db. In those cases to_sql fails silently and returns -1 (indicating the failure).

I don't use any schema, which should rule out this issue as a possible cause, and I am not using SQL Server either. What completely baffles me is that some of these dataframes are written to the db and some are not.

code:

from sqlalchemy import create_engine, DateTime

# the postgresql+psycopg URL selects the psycopg 3 driver
engine = create_engine('postgresql+psycopg://plantwatch:[email protected]/plantwatch')
df.to_sql('power', con=engine, if_exists='append', index=False,
          dtype={'produced_at': DateTime})
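
One way to narrow this down (a minimal sketch; frames is a hypothetical dict mapping each id to its dataframe, and the table and connection details are taken from the snippet above) is to compare what to_sql reports against an actual row count after each write:

from sqlalchemy import create_engine, text, DateTime

engine = create_engine('postgresql+psycopg://plantwatch:[email protected]/plantwatch')

# 'frames' is a hypothetical dict: {id: dataframe}, one dataframe per id
for frame_id, frame in frames.items():
    # recent pandas versions return the driver-reported row count here
    # (older versions return None; a -1 like the one above is such a rowcount)
    reported = frame.to_sql('power', con=engine, if_exists='append',
                            index=False, dtype={'produced_at': DateTime})
    # count what actually landed in the table for this id
    with engine.connect() as conn:
        stored = conn.execute(text('SELECT count(*) FROM power WHERE id = :id'),
                              {'id': frame_id}).scalar()
    print(f'{frame_id}: reported={reported} stored={stored} expected={len(frame)}')

Any dataframe whose stored count doesn't match the expected count is one of the failing writes.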

example dataframes (each dataframe holds the rows for a single id, and one dataframe is written to the db per id) and expected db content:

produced_at             id          value
2015-01-01 00:00:00     someid      1
2015-01-01 01:00:00     someid      2
2015-01-01 00:00:00     someid2     1
2015-01-01 01:00:00     someid2     2

actual db content:

produced_at             id          value
2015-01-01 00:00:00     someid      1
2015-01-01 01:00:00     someid      2
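
For reference, dataframes matching the example above could be built like this (a sketch; the ids and values are the placeholders from the tables):

import pandas as pd

def make_frame(frame_id):
    # one dataframe per id, matching the example rows above
    return pd.DataFrame({
        'produced_at': pd.to_datetime(['2015-01-01 00:00:00', '2015-01-01 01:00:00']),
        'id': frame_id,
        'value': [1, 2],
    })

# hypothetical collection of per-id dataframes
frames = {i: make_frame(i) for i in ('someid', 'someid2')}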

A hacky workaround would be to dump all dataframes to .csv files and import them into Postgres one by one, but there has to be another way.
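
That said, the CSV detour can at least be automated in-process; below is a minimal sketch using psycopg 3's COPY support (the table and column names are taken from the example, and the connection string is assumed to match the one above):

import io
import psycopg

def copy_frame(frame, conninfo='postgresql://plantwatch:[email protected]/plantwatch'):
    # serialize the dataframe to CSV in memory instead of a file on disk
    buf = io.StringIO()
    frame.to_csv(buf, index=False, header=False)
    buf.seek(0)
    # the connection context manager commits on clean exit
    with psycopg.connect(conninfo) as conn, conn.cursor() as cur:
        with cur.copy('COPY power (produced_at, id, value) FROM STDIN WITH (FORMAT csv)') as copy:
            copy.write(buf.read())

Unlike the silent to_sql failures, COPY raises an error on any row it cannot load, so there is at least something to act on.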
