We have a requirement where an application copies CSV files into a GCS bucket every week. We want to import this CSV data into an AlloyDB for PostgreSQL database. The process should run whenever the application team drops a file in the GCS bucket. How can I achieve this while adhering to best practices on GCP?
1 Answer
I’d suggest automating the process with a Cloud Run function that is triggered whenever a new CSV file lands in the GCS bucket. You can use a Python or Node.js script to read and transform the files, then load the data using the appropriate driver or connector. You can read this article on connecting to AlloyDB for PostgreSQL.
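For illustration, here is a minimal Python sketch of such a function. It assumes an Eventarc trigger on the object-finalized event, a hypothetical target table `weekly_data` with columns `col_a` and `col_b`, and connection details supplied through environment variables; adjust all of these to your setup:

```python
# Assumed dependencies: functions-framework, google-cloud-storage,
# google-cloud-alloydb-connector[pg8000], SQLAlchemy
import csv
import io
import os

import functions_framework
import sqlalchemy
from google.cloud import storage
from google.cloud.alloydb.connector import Connector

# Hypothetical configuration, supplied via environment variables.
# INSTANCE_URI looks like: projects/PROJECT/locations/REGION/clusters/CLUSTER/instances/INSTANCE
INSTANCE_URI = os.environ["ALLOYDB_INSTANCE_URI"]
DB_USER = os.environ["DB_USER"]
DB_PASS = os.environ["DB_PASS"]
DB_NAME = os.environ["DB_NAME"]

connector = Connector()


def getconn():
    # The AlloyDB connector handles IAM authorization and TLS for the connection.
    return connector.connect(
        INSTANCE_URI, "pg8000", user=DB_USER, password=DB_PASS, db=DB_NAME
    )


# Connection pool reused across invocations of the same instance.
pool = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)


@functions_framework.cloud_event
def import_csv(cloud_event):
    # Triggered by a google.cloud.storage.object.v1.finalized event.
    data = cloud_event.data
    bucket_name, object_name = data["bucket"], data["name"]

    if not object_name.endswith(".csv"):
        return  # ignore non-CSV objects

    # Download the object and parse it as CSV with a header row.
    blob = storage.Client().bucket(bucket_name).blob(object_name)
    reader = csv.DictReader(io.StringIO(blob.download_as_text()))

    # Hypothetical table and columns; map the CSV headers to your schema.
    rows = [{"col_a": r["col_a"], "col_b": r["col_b"]} for r in reader]
    if not rows:
        return

    with pool.connect() as conn:
        conn.execute(
            sqlalchemy.text(
                "INSERT INTO weekly_data (col_a, col_b) VALUES (:col_a, :col_b)"
            ),
            rows,  # executed as a batched multi-row insert
        )
        conn.commit()
```

Deploy this with an Eventarc trigger filtered on the `google.cloud.storage.object.v1.finalized` event for your bucket, so the function runs once per uploaded object. For very large files, consider streaming the rows in chunks rather than loading the whole file into memory, and make the insert idempotent (for example, by tracking processed object names) in case the event is redelivered.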
If you want to focus more on scheduling the process instead, consider using Cloud Scheduler. Here’s the documentation for scheduling a Cloud Run function.
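If you go that route, here is a hedged sketch using the Cloud Scheduler Python client to create a weekly job that invokes the function’s HTTPS endpoint; the project, region, job name, URL, and service account below are all placeholders:

```python
# Assumed dependency: google-cloud-scheduler
from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("my-project", "us-central1")

job = scheduler_v1.Job(
    name=f"{parent}/jobs/weekly-csv-import",
    schedule="0 6 * * 1",  # every Monday at 06:00
    time_zone="Etc/UTC",
    http_target=scheduler_v1.HttpTarget(
        uri="https://csv-import-xyz-uc.a.run.app",  # your function's URL
        http_method=scheduler_v1.HttpMethod.POST,
        # OIDC token lets Scheduler call a function that requires authentication.
        oidc_token=scheduler_v1.OidcToken(
            service_account_email="scheduler-sa@my-project.iam.gserviceaccount.com"
        ),
    ),
)

client.create_job(parent=parent, job=job)
```

Note that scheduling and event triggering solve different problems: the Eventarc trigger reacts to each file as it arrives, while a Cloud Scheduler job runs on a fixed cadence regardless of when the file lands, so pick whichever matches how reliably the application team delivers the weekly file.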