
google cloud platform - Schedule weekly data import to AlloyDB with CSV stored in GCS bucket - Stack Overflow


We have a requirement where an application copies CSV files into a GCS bucket every week. We want to import this CSV data into an AlloyDB for PostgreSQL database, and the process should run whenever the application team drops a file in the GCS bucket. How can I achieve this while adhering to best practices in GCP?


asked Feb 5 at 17:52 by Coolbreeze

1 Answer


I’d suggest automating the process with a Cloud Run function that is triggered whenever a new CSV file lands in the GCS bucket. A Python or Node.js script can read and transform the files, then load the data using the appropriate driver or connector. You can read this article on connecting to AlloyDB for PostgreSQL.
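As a minimal sketch of the event-driven import, the handler below processes one GCS "object finalized" event. The `download` and `insert` callables are assumptions standing in for real clients (e.g. `google-cloud-storage` for the download and batched `INSERT`s over an AlloyDB connection for the load), so the core logic stays testable:

```python
import csv
import io


def rows_from_csv(text: str):
    """Parse CSV text (with a header row) into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))


def handle_finalized_object(event_data, download, insert):
    """Process one GCS "object finalized" event.

    event_data: dict with "bucket" and "name", as in the CloudEvent payload
    download:   callable (bucket, name) -> CSV text; in production this would
                read the object via google-cloud-storage (assumed)
    insert:     callable (rows) -> None; in production this would run batched
                INSERTs over an AlloyDB connection (assumed)
    Returns the number of rows imported.
    """
    name = event_data["name"]
    if not name.endswith(".csv"):
        return 0  # ignore non-CSV uploads in the same bucket
    rows = rows_from_csv(download(event_data["bucket"], name))
    insert(rows)
    return len(rows)
```

Keeping the storage and database access behind callables is a design choice: the parsing and filtering logic can be unit-tested without GCP credentials, and the real clients are wired in only at the Cloud Run entry point.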

If you want to focus more on scheduling the process, consider using Cloud Scheduler. Here’s the documentation for scheduling a Cloud Run function.
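For the scheduled variant, a Cloud Scheduler job can invoke the Cloud Run service's HTTPS endpoint on a cron schedule. The job name, URL, and service account below are placeholders for illustration:

```shell
# Run the import every Monday at 06:00; URL and service account are examples.
gcloud scheduler jobs create http weekly-csv-import \
  --schedule="0 6 * * 1" \
  --uri="https://csv-import-xxxxx-uc.a.run.app/" \
  --http-method=POST \
  --oidc-service-account-email="scheduler-invoker@my-project.iam.gserviceaccount.com"
```

The OIDC service account lets Scheduler authenticate to a Cloud Run service that is not publicly invokable, which is the recommended setup.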
