
Deploying a Kedro project to Cloud Build with pytest accessing Google Cloud Storage - Stack Overflow


I am trying to deploy my Kedro project to Cloud Run. My cloudbuild.yml looks like this:

steps:
- name: "noobzik/uv-gcp-cloud-build"
  id: CI
  entrypoint: /bin/bash
  env:
    - PROJECT_ID=$PROJECT_ID
    - SERVICE_ACCOUNT=$_SERVICE_ACCOUNT
  args:
  - -c
  - |
    echo "$SERVICE_ACCOUNT" | base64 -d > service_account.json
    gcloud auth activate-service-account --key-file=service_account.json
    gcloud config set project "$PROJECT_ID"
    chmod a+x install.sh && 
    ./install.sh &&
    source .venv/bin/activate &&
    pytest .
#- name: "noobzik/uv-gcp-cloud-build"
#  id: CD
#  entrypoint: /bin/bash
#  args:
#  - -c
#  - 'chmod a+x install.sh && ./install.sh && kedro run --pipeline global'
#  env:
#  - 'ENV=$BRANCH_NAME'
#  - 'MLFLOW_SERVER=$_MLFLOW_SERVER'

logs_bucket: gs://purchase_predict

The reason I use my own Docker image is to use Astral uv to speed up the build, since it installs the pip requirements much faster.

The issue is that pytest fails the unit test for my node:

tests/pipelines/loading/test_pipeline.py F                               [ 40%]

This test basically grabs a Spark-generated CSV folder named primary.csv located at gs://purchase_predict/primary.csv. It fails because my build is not authenticated to Google Cloud. So I tried passing the JSON key as a substitution variable into _SERVICE_ACCOUNT (both base64-encoded and plain), but it's not working:

base64: invalid input
ERROR: (gcloud.auth.activate-service-account) Could not read json file service_account.json: Expecting value: line 1 column 1 (char 0)
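For what it's worth, `base64: invalid input` usually means the string being decoded was line-wrapped or otherwise mangled in transit. GNU `base64` wraps output at 76 columns by default, and a multi-line value pasted into a substitution variable can lose its newlines. A minimal round-trip sketch of what I believe the encode/decode should look like (the file names and JSON content here are placeholders, not my real key):

```shell
# Placeholder key file, standing in for the real service account JSON
printf '{"type": "service_account"}' > service_account.json

# Encode with -w 0 so the output is a single line with no wrapping,
# which is what a Cloud Build substitution variable can safely carry
ENCODED=$(base64 -w 0 service_account.json)

# Inside the build step, decode it back to a file for gcloud auth
echo "$ENCODED" | base64 -d > decoded.json
```

If the value is passed on the command line instead of through a trigger, I would expect something along the lines of `gcloud builds submit --substitutions=_SERVICE_ACCOUNT="$ENCODED"`, though I have only tried it through the trigger UI.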

I am starting to run out of solutions. In case it matters, I have already granted my service account the roles needed to access GCS.

Any help is appreciated.
