I am trying to read 10 million records from BigQuery, transform them, and write them to a .csv file, then upload that file to an SFTP server using Node.js. Locally, this process takes 5-6 hours to complete.
When I deploy this solution to GCP Cloud Run, the container shuts down after 2-3 seconds with a 503 error. My Cloud Run configuration is as follows:
Autoscaling: Up to 1 container
CPU allocated: default
Memory allocated: 2Gi
Concurrency: 10
Request timeout: 900 seconds
Is GCP Cloud Run a good option for this kind of long-running background process?