Run bash script in TF and upload to S3 in GitLab pipelines

I am using a GitLab pipeline that uses Terraform to deploy AWS resources. The pipeline configuration is pre-existing, so I cannot change anything there. The pipeline run needs to generate some files with a shell script via a data "external" block and then upload the generated files to S3 with the aws_s3_object resource. If I run this locally on my machine it works, but it does not work in the pipeline.

data "external" "proto_generator" {
  program = [
    "sh", "-c",
    <<-EOT
         echo "=== Starting Proto Generation ===" >&2

        # Redirect all installation output to stderr
        apk add --no-cache bash jq protobuf protobuf-dev >&2

        # Show current directory and contents
        pwd >&2
        ls -la ${path.root}/libraries/proto_files >&2

        # Execute proto generator script and redirect its output to stderr
        bash ${path.root}/scripts/proto_generator.sh >&2

        # List generated files for debugging
        echo "Generated files:" >&2
        ls -la ${path.root}/libraries/proto_files/*/*_descriptor.desc >&2

        # Only output the JSON object, nothing else
        printf '{"result":"success"}'
    EOT
  ]
}
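
For context, the external data source only consumes the single JSON object the program prints to stdout and exposes it as the result map of strings; the stderr debug prints are not part of that result. As a minimal sketch (the output name below is hypothetical, not part of the original configuration), that map could be surfaced to confirm the script actually ran:

# Hypothetical output, not in the original question: surfaces the JSON the
# script prints on stdout, which the external data source exposes as a map
# of strings. Once the data source has run it should show the
# {"result" = "success"} map.
output "proto_generation_status" {
  value = data.external.proto_generator.result
}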

I use the aws_s3_object resource to upload to an S3 bucket as follows:

resource "aws_s3_object" "descriptor_files" {
  # Use for_each to handle multiple files
  for_each = fileset("${path.root}/libraries/proto_files", "**/*_descriptor.desc")

  bucket = aws_s3_bucket.data_model_bucket.id
  key    = each.value
  source = "${path.root}/libraries/proto_files/${each.value}"

  # Ensure this runs after the descriptor files are generated
  depends_on = [data.external.proto_generator]

  # Optional: Add content type and etag for caching
  content_type = "application/octet-stream"
  etag         = filemd5("${path.root}/libraries/proto_files/${each.value}")
}
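
As a debugging aid (a hedged sketch; the output name is hypothetical and not in the original configuration), the same fileset() expression can be exposed as an output, so the plan shows which descriptor files Terraform actually finds in the CI workspace:

# Hypothetical debugging output: lists the descriptor files fileset()
# discovers at plan time; an empty set here would explain why no
# aws_s3_object resources get created in the pipeline.
output "descriptor_files_found" {
  value = fileset("${path.root}/libraries/proto_files", "**/*_descriptor.desc")
}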

The files should be generated in the plan phase and then uploaded in the apply phase. I have put some print statements in the data external block, but nothing gets printed so far. The pipeline reports no issues, yet no file upload happens and I cannot see any logs in the plan or apply phase. What am I doing wrong?

asked Mar 30 at 13:31 by dsm
  • Do you use the artifacts keyword to save the files and the plan output? If yes, then I'm not sure what's wrong, but try adding another line in the script block of the step that runs terraform apply (like the ls -la prints you did, but outside the Terraform code) to see whether the files are available to that step. – towel Commented Mar 30 at 18:17
  • No, I don't use artifacts. This would need changes in the pipeline setup, right? – dsm Commented Mar 30 at 20:19
  • In the terraform plan step you add an artifacts: block and put a paths: list inside it with all the outputs you wish to transfer to the terraform apply step. GitLab will then make the artifacts available to any step that comes after the plan step. – towel Commented Mar 31 at 19:56

1 Answer

To demonstrate what I meant: the artifacts keyword instructs GitLab to save the listed files/directories and make them available to later jobs, for example:

stages:
  - plan
  - apply

image: hashicorp/terraform

plan:
  stage: plan
  script:
  - terraform init
  - terraform plan -out=tfplan
  artifacts:
    paths:
    - libraries/proto_files/**/*_descriptor.desc
    - tfplan

# Here we get the artifacts that were saved in the plan step
apply:
  stage: apply
  script:
  - terraform init
  - terraform apply -auto-approve tfplan
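
One assumption worth stating, since the full pipeline config isn't shown: later stages normally download artifacts from earlier jobs automatically, but you can make the hand-off explicit with the needs (or dependencies) keyword on the apply job, for example:

# Hypothetical variant of the apply job above: needs makes the job wait for
# plan and explicitly fetch its artifacts (the generated .desc files and the
# tfplan file) before terraform apply runs.
apply:
  stage: apply
  needs: ["plan"]
  script:
  - terraform init
  - terraform apply -auto-approve tfplan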