I am trying to upload a .zip to a Cloud Run function which contains main.py and other files, along with a token.pickle that saves OAuth info. But after the Cloud Run function is created, I'm unable to see the pickle file in the source and also not able to use it.
I rechecked and I'm indeed zipping that file and uploading it as a zip, but I can't see it in the source of the Cloud Run function.
2 Answers
The problem, and the strength, with Cloud Run Functions is that it is a managed service: container build and deployment are included in the service. Fewer things to do, but less control over what it does!
Therefore, when you submit sources (zipped or not), the Cloud Run Functions build mechanism (based on buildpacks) picks only the files relevant to building its container. In Python, the .py files and requirements.txt are relevant, but a token.pickle is not.
The way to solve this is to get more control over the container build: use a Cloud Run service (not a function) with a Dockerfile, so you can build your own container with the files and constraints you want.
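If you go that route, a minimal Dockerfile sketch could look like the one below. It assumes your main.py exposes an HTTP function named "main" served by functions-framework, and that functions-framework is listed in requirements.txt; the base image, target name, and port handling are illustrative, not taken from the question.

# Minimal sketch: build your own image so token.pickle ships with the code.
# Assumes functions-framework is in requirements.txt and main.py defines
# an HTTP function named "main" (both are illustrative assumptions).
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy everything, including token.pickle, into the image
COPY . .
# Cloud Run injects PORT; listen on it (defaults to 8080)
CMD exec functions-framework --target=main --port=${PORT:-8080}

With this approach the token is baked into the image, so nothing in the build step can strip it out.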
This can only be fixed by storing the token.pickle file in a GCS bucket or Secret Manager and then loading it from the Cloud Run function. This is probably because Cloud Run doesn't allow certain files to be kept in the source, and it is stateless in any case.
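As a rough sketch of the GCS approach (assuming the google-cloud-storage client library is in requirements.txt; the bucket and object names below are placeholders), the function could fetch and unpickle the token at startup:

import pickle
from google.cloud import storage  # requires google-cloud-storage in requirements.txt

# Placeholder names: replace with your real bucket and object
BUCKET_NAME = "my-config-bucket"
OBJECT_NAME = "token.pickle"

def load_token():
    """Download token.pickle from GCS and unpickle it."""
    client = storage.Client()
    blob = client.bucket(BUCKET_NAME).blob(OBJECT_NAME)
    return pickle.loads(blob.download_as_bytes())

creds = load_token()  # use these credentials inside your request handler

Secret Manager works similarly: store the pickled bytes as a secret version and read them back with the google-cloud-secret-manager client's access_secret_version call, then pickle.loads the payload.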