Overview
This document provides step-by-step instructions for transferring data between Google Cloud Storage (GCS) and Amazon S3 using code. The methods covered include using the AWS SDK, Google Cloud SDK, and the Python boto3 library for automated scripting.
Prerequisites
- An active Google Cloud Platform (GCP) account with access to Google Cloud Storage.
- An active AWS account with permissions to read from and write to Amazon S3.
- Google Cloud SDK for Python (the google-cloud-storage client library) installed, packaged for the Python 3.13 runtime.
- Python environment with the boto3 library installed for scripting.
- An AWS compute environment (e.g., Lambda) configured with the necessary dependencies and layers.
AWS Lambda Setup
To connect AWS Lambda with Google Cloud Storage (GCS), you need to add the Google Cloud SDK to your Lambda function. This is done by creating a Lambda Layer and attaching it to the function.

1. Navigate to the AWS Lambda console:
- Open the AWS Console and go to Lambda.
2. Create a new Lambda layer for the Google Cloud SDK (GCS SDK):
- In the left-hand menu, click on "Layers".
- Select "Create layer".
- Provide a name (e.g., gcs-sdk-layer).
- Upload a .zip file containing the Google Cloud SDK and any required dependencies.
- Select the appropriate runtime.
- Click "Create".
3. Create a Lambda function:
- Go to the "Functions" section in the Lambda console.
- Click "Create function".
- Set the function name to dev01-GCP-Connect.
- Select the same runtime used when creating the layer.
4. Attach the GCS SDK layer to the Lambda function:
- After the function is created, scroll to the "Layers" section.
- Click "Add a layer".
- Select the custom layer you created (gcs-sdk-layer) and attach it to the function.
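The layer .zip referenced in step 2 can be assembled locally. The commands below are a sketch, not the only way to do it: the layer name, the package set, and the Python runtime are assumptions and should match what you select in the console. Lambda expects Python layer packages under a top-level python/ directory.

```shell
# Build a layer zip containing the GCS and Azure client libraries.
mkdir -p layer/python
pip install google-cloud-storage azure-storage-blob -t layer/python
# Zip from inside the staging directory so python/ is at the archive root.
(cd layer && zip -r ../gcs-sdk-layer.zip python)
```

Upload the resulting gcs-sdk-layer.zip in the "Create layer" dialog.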

Code for Transferring Files
The following Python script enables automated file transfer between Google Cloud Storage (GCS), Amazon S3, and Azure Blob Storage from an AWS Lambda function. It uses the google-cloud-storage client library to access GCS, boto3 to interact with Amazon S3, and azure-storage-blob for Azure. The script is designed to run within a Lambda environment that includes the necessary SDKs and credentials. It supports downloading a file from one store and uploading it directly to another, streamlining cross-cloud data movement.
import json
import logging
import boto3
from google.cloud import storage
import os
import warnings
import botocore.exceptions
from azure.storage.blob import BlobServiceClient
# Initialize S3 client
s3 = boto3.client("s3")
# Set GCP credentials from environment variables or AWS Secrets Manager
# GCP_CREDENTIALS = json.loads(os.getenv("GCS_SERVICE_ACCOUNT_JSON", "{}"))
LOCAL_DESTINATION_PATH = "/tmp/local_file.txt"
def upload_to_gcs(bucket_name, destination_blob_name, file_data, auth_credentials):
    """Uploads a file to Google Cloud Storage"""
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_string(file_data)
    print(f"File uploaded to GCS: {bucket_name}/{destination_blob_name}")
    return True
def get_list_gcs_files(bucket_name, gcs_folder_path, auth_credentials, n):
    """Returns the names of the n most recently updated files under a GCS folder"""
    client = storage.Client.from_service_account_info(auth_credentials)
    blobs = list(client.list_blobs(bucket_name, prefix=gcs_folder_path))
    sorted_blobs = sorted(blobs, key=lambda x: x.updated, reverse=True)
    last_n_files = [blob.name for blob in sorted_blobs[:n]]
    return last_n_files
def delete_list_gcs_files(bucket_name, gcs_folder_path, auth_credentials, n):
    """Deletes all files under a GCS folder (the n parameter is currently unused)"""
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(bucket_name)
    blobs = bucket.list_blobs(prefix=gcs_folder_path)
    deleted_files = []
    print("Deleting files:")
    for blob in blobs:
        blob.delete()
        deleted_files.append(blob.name)
    return deleted_files
def download_from_gcs(bucket_name, source_blob_name, auth_credentials):
    """Downloads a file from Google Cloud Storage"""
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    content = blob.download_as_bytes()
    print(f"File downloaded from GCS: {bucket_name}/{source_blob_name}")
    return content
def download_from_gcs_for_azure(gcs_bucket, folder_path, file_name, auth_credentials):
    """Downloads a file from Google Cloud Storage to /tmp"""
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(gcs_bucket)
    # Construct the full path inside GCS
    blob_name = f"{folder_path}/{file_name}"
    blob = bucket.blob(blob_name)
    local_path = f"/tmp/{file_name}"
    blob.download_to_filename(local_path)
    print(f"File downloaded from GCS: {gcs_bucket}/{blob_name} → {local_path}")
    return local_path
def upload_to_s3(bucket_name, file_name, file_data, auth_credentials):
    """Uploads a file to S3"""
    s3.put_object(Bucket=bucket_name, Key=file_name, Body=file_data)
    print(f"File uploaded to S3: {bucket_name}/{file_name}")
    return True
def upload_to_azure(file_path, gcs_file, azureCredentials):
    """Uploads a file to Azure"""
    AZURE_STORAGE_ACCOUNT_NAME = azureCredentials.get("azure_storage_account_name", "")
    AZURE_STORAGE_ACCOUNT_KEY = azureCredentials.get("azure_storage_account_key", "")
    CONTAINER_NAME = azureCredentials.get("azure_container_name", "")
    blob_service_client = BlobServiceClient(
        account_url=f"https://{AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net",
        credential=AZURE_STORAGE_ACCOUNT_KEY
    )
    container_client = blob_service_client.get_container_client(CONTAINER_NAME)
    blob_client = container_client.get_blob_client(gcs_file)
    with open(file_path, "rb") as file:
        blob_client.upload_blob(file, overwrite=True)
    print(f"File '{gcs_file}' uploaded successfully!")
    return gcs_file
def download_from_s3(bucket_name, file_name, auth_credentials):
    """Downloads a file from S3"""
    response = s3.get_object(Bucket=bucket_name, Key=file_name)
    content = response["Body"].read()
    print(f"File downloaded from S3: {bucket_name}/{file_name}")
    return content
def download_from_azure(azureFile, azureCredentials):
    """Downloads a file from Azure and returns its bytes"""
    AZURE_STORAGE_ACCOUNT_NAME = azureCredentials.get("azure_storage_account_name", "")
    AZURE_STORAGE_ACCOUNT_KEY = azureCredentials.get("azure_storage_account_key", "")
    CONTAINER_NAME = azureCredentials.get("azure_container_name", "")
    blob_service_client = BlobServiceClient(
        account_url=f"https://{AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net",
        credential=AZURE_STORAGE_ACCOUNT_KEY
    )
    container_client = blob_service_client.get_container_client(CONTAINER_NAME)
    blob_client = container_client.get_blob_client(azureFile)
    blob_data = blob_client.download_blob().readall()
    print(f"File '{azureFile}' downloaded from Azure")
    return blob_data
def write_to_gcs_from_local(bucket_name, destination_blob_name, local_source_path, auth_credentials):
    """Uploads a file to the GCS bucket after fetching the local file contents"""
    with open(local_source_path, "rb") as file:
        file_data = file.read()
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_string(file_data)
    print(f"File uploaded to GCS: {bucket_name}/{destination_blob_name}")
    return True
def read_from_gcs_to_local(bucket_name, destination_blob_name, local_source_path, auth_credentials):
    # Get the bucket and file (blob)
    client = storage.Client.from_service_account_info(auth_credentials)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.download_to_filename(local_source_path)
    print(f"File {destination_blob_name} downloaded and saved as {local_source_path}")
    # Optional: read back the file contents
    with open(local_source_path, "rb") as file:
        data = file.read()
    print("File content:", data[:100])  # Print first 100 bytes as preview
    return True
def write_to_local_from_s3(s3_bucket_name, s3_file, local_source_path, auth_credentials):
    file_data = download_from_s3(s3_bucket_name, s3_file, auth_credentials)
    with open(local_source_path, "wb") as file:
        file.write(file_data)
    print(f"File {s3_file} downloaded from {s3_bucket_name} and saved as {local_source_path}")
    with open(local_source_path, "rb") as file:
        data = file.read()
    print("File content:", data[:100])  # Print first 100 bytes as preview
    return True
def upload_to_s3_from_local(s3_bucket_name, s3_file, local_source_path, auth_credentials):
    with open(local_source_path, "rb") as file:
        file_data = file.read()
    print("File content:", file_data[:100])  # Print first 100 bytes as preview
    upload_to_s3(s3_bucket_name, s3_file, file_data, auth_credentials)
    return True
def azure_list_blobs(azureCredentials):
    """List all files in the Azure Blob container."""
    AZURE_STORAGE_ACCOUNT_NAME = azureCredentials.get("azure_storage_account_name", "")
    AZURE_STORAGE_ACCOUNT_KEY = azureCredentials.get("azure_storage_account_key", "")
    CONTAINER_NAME = azureCredentials.get("azure_container_name", "")
    blob_service_client = BlobServiceClient(
        account_url=f"https://{AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net",
        credential=AZURE_STORAGE_ACCOUNT_KEY
    )
    container_client = blob_service_client.get_container_client(CONTAINER_NAME)
    blobs = [blob.name for blob in container_client.list_blobs()]
    return blobs
def lambda_handler(event, context):
    try:
        # Extract query parameters
        query_params = event.get('queryStringParameters', {})
        # Extract request body (if it's a POST request)
        body = json.loads(event.get('body', '{}')) if event.get('body') else {}
        auth_credentials = body.get("credentials")
        file_payload = body.get("filePayload") or {}
        azureCredentials = body.get("azureCredentials")
        # Extract HTTP method
        method = event.get('httpMethod', 'GET')
        action = file_payload.get("action")  # e.g. "s3_to_gcs" or "gcs_to_s3"
        s3_bucket = file_payload.get("s3_bucket", "")
        s3_file = file_payload.get("s3_file", "")
        gcs_bucket = file_payload.get("gcs_bucket", "")
        gcs_file = file_payload.get("gcs_file", "")
        gcs_folder = file_payload.get("gcs_folder", "")
        gcs_folder_path = file_payload.get("gcs_folder_path", "")
        azureFile = file_payload.get("azureFile", "")
        local_file = body.get("local_file", LOCAL_DESTINATION_PATH)
        warnings.simplefilter("ignore", RuntimeWarning)
        if method != "POST":
            return {
                "statusCode": 405,
                "body": json.dumps({"status": "failed", "message": "Method Not Allowed"})
            }
        if not action:
            return {
                "statusCode": 400,
                "body": json.dumps({"message": "The action must not be empty."})
            }
        if action == "s3_to_gcs":
            file_data = download_from_s3(s3_bucket, s3_file, auth_credentials)
            if upload_to_gcs(gcs_bucket, gcs_file, file_data, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {s3_file} copied from S3 to GCS as {gcs_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {s3_file} from S3 to GCS as {gcs_file}"})
            }
        elif action == "azure_to_gcs":
            file_data = download_from_azure(azureFile, azureCredentials)
            if upload_to_gcs(gcs_bucket, gcs_file, file_data, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {azureFile} copied from Azure to GCS as {gcs_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {azureFile} from Azure to GCS as {gcs_file}"})
            }
        elif action == "gcs_to_s3":
            file_data = download_from_gcs(gcs_bucket, gcs_file, auth_credentials)
            if upload_to_s3(s3_bucket, s3_file, file_data, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {gcs_file} copied from GCS to S3 as {s3_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {gcs_file} from GCS to S3 as {s3_file}"})
            }
        elif action == "gcs_to_azure":
            local_path = download_from_gcs_for_azure(gcs_bucket, gcs_folder, gcs_file, auth_credentials)
            if upload_to_azure(local_path, gcs_file, azureCredentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {gcs_file} copied from GCS to Azure"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {gcs_file} from GCS to Azure"})
            }
        elif action == "list_gcs_files":
            files = get_list_gcs_files(gcs_bucket, gcs_folder_path, auth_credentials, n=10)
            if files:
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "gcsFile": files})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": "Found issues in getting the GCS file list"})
            }
        elif action == "delete_gcs_files":
            files = delete_list_gcs_files(gcs_bucket, gcs_folder_path, auth_credentials, n=10)
            if files:
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "deletedGcsFiles": files})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": "Found issues in deleting the GCS file list"})
            }
        elif action == "list_azure_files":
            files = azure_list_blobs(azureCredentials)
            if files:
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "azureFile": files})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": "Found issues in getting the Azure file list"})
            }
        elif action == "gcs_to_local":
            if read_from_gcs_to_local(gcs_bucket, gcs_file, local_file, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {gcs_file} copied from GCS to local as {local_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in downloading {gcs_file} from GCS to local as {local_file}"})
            }
        elif action == "local_to_gcs":
            if write_to_gcs_from_local(gcs_bucket, gcs_file, local_file, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {local_file} copied from local to GCS as {gcs_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {local_file} from local to GCS as {gcs_file}"})
            }
        elif action == "local_to_s3":
            if upload_to_s3_from_local(s3_bucket, s3_file, local_file, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {local_file} copied from local to S3 as {s3_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in uploading {local_file} from local to S3 as {s3_file}"})
            }
        elif action == "s3_to_local":
            if write_to_local_from_s3(s3_bucket, s3_file, local_file, auth_credentials):
                return {
                    "statusCode": 200,
                    "body": json.dumps({"status": "success", "message": f"File {s3_file} downloaded from S3 to local as {local_file}"})
                }
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Found issues in downloading {s3_file} from S3 to local as {local_file}"})
            }
        else:
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "failed", "message": f"Unsupported action: {action}"})
            }
    except json.JSONDecodeError:
        return {
            "statusCode": 400,
            "body": json.dumps({"status": "failed", "message": "Invalid JSON format in request body"})
        }
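For a quick local sanity check of the routing logic, the API Gateway event the handler parses can be sketched as below. The bucket and file names are placeholders taken from the payload examples in this document, and the credentials object is deliberately truncated:

```python
import json

# A minimal sketch of the event shape lambda_handler expects.
event = {
    "httpMethod": "POST",
    "body": json.dumps({
        "credentials": {"type": "service_account"},  # truncated service-account JSON
        "filePayload": {
            "action": "s3_to_gcs",
            "s3_bucket": "ac-gcp-data",
            "s3_file": "Snowflake/member_report.CSV",
            "gcs_bucket": "annex-cloud-dev",
            "gcs_file": "Snowflake/member_report.CSV",
        },
    }),
}

# The handler parses event["body"] as JSON and dispatches on filePayload.action.
body = json.loads(event["body"])
print(body["filePayload"]["action"])  # → s3_to_gcs
```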
Request URL for Data Transfer
Use the following URL to initiate data transfer between Google Cloud Storage (GCS) and Amazon S3 in either direction.
URL: https://wz4jw9nt68.execute-api.us-east-2.amazonaws.com/develop/gcp-connect
Request Header
- Authorization Token: my-secret-token
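A request can be issued from Python without extra dependencies; this is a sketch using the standard-library urllib, with the URL and token from above (the payload shown mirrors the examples in the next section, and the final send is left commented out):

```python
import json
import urllib.request

url = "https://wz4jw9nt68.execute-api.us-east-2.amazonaws.com/develop/gcp-connect"
payload = {
    "action": "gcs_to_s3",
    "gcs_bucket": "annex-cloud-dev",
    "gcs_file": "member_report.CSV",
    "s3_bucket": "ac-gcp-data",
    "s3_file": "member_report.CSV",
}

# Build the POST request with the authorization header from this document.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "my-secret-token", "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually send the request
```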
Request Payload Examples
1. Copy a file from S3 into GCS (with sub-directory).
{
    "action": "s3_to_gcs",
    "gcs_bucket": "annex-cloud-dev",
    "gcs_file": "Snowflake/member_report.CSV",
    "s3_bucket": "ac-gcp-data",
    "s3_file": "Snowflake/member_report.CSV"
}
2. Copy a file from GCS into S3 (with sub-directory).
{
    "action": "gcs_to_s3",
    "gcs_bucket": "annex-cloud-dev",
    "gcs_file": "Snowflake/member_report.CSV",
    "s3_bucket": "ac-gcp-data",
    "s3_file": "Snowflake/member_report.CSV"
}
3. Copy a file from GCS into S3 (bucket only).
{
    "action": "gcs_to_s3",
    "gcs_bucket": "annex-cloud-dev",
    "gcs_file": "member_report.CSV",
    "s3_bucket": "ac-gcp-data",
    "s3_file": "member_report.CSV"
}
4. Copy a file from S3 into GCS (bucket only).
{
    "action": "s3_to_gcs",
    "gcs_bucket": "annex-cloud-dev",
    "gcs_file": "member_report.CSV",
    "s3_bucket": "ac-gcp-data",
    "s3_file": "member_report.CSV"
}
5. GCS to Azure.
{
"credentials": {
"type": "service_account",
"project_id": "manscaped-sre",
"site_id": "200000",
"private_key_id": "4ae4f33553d5938d9e82bec5757ec39b920e829e",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCW0rUUE7JOnJ8u\nsFRaxcbBfZo1Um1puaJ/PUOg4+pRyW6rJ0MKAyIPUkUTRHgbZMfRCNPUc/iNHzwj\nTIcL+uQ0CCioROo95ysx/FMlsLSmtGGb0InI4fHbqia2sO0OFXuUhq2AJy7fsTsv\nNzniBgrZXYMD9EWALl1xweD+tqef9jU5DMUJs6m/orbU8pQpN9odaauJkwClZ1wW\n5c6iev72Uz4p9yxx5hIrbr6NjQMD0a7+30Gn0R4K4mI6o2ZpX3WJcA53zn6em45C\nE0kppdV4RlBiJT/U9vBwXCjDqhPrhQw7Og9Vvdqfx6CXvgK6uaxHQzgOIcVfTAfz\n/1alk+cNAgMBAAECggEADY7wl1bz8wwyZSyV0Lkx1mgXqk54JoZ5HK4hu79xa5vL\nlphHdwl0EOOI8SP5FiFXsytze4hXYjxaGCRDiaiqxiCrowDa++ihofM4eE1r2Aak\nNbEoCcaCCwi/RTIfhWIaA26d/0TGenaAxwxUaES5yfOVmQ/nBZBldxoCQ3OhGwiQ\nSQ9RIE/ABIZoxvslEV1lmltRpsnb8o6rexHuj7P918Z9CDrepqthfK+Ahudnak5g\nKwCLkHCSnSw/5dEz+P50In/qTJGMT6Vr+VxQYW1fjTXzkdS34jtgECb20z5usowT\nu2QVYZGfASOcZkhy4JbGInB3mTxTEfo3qke3P3S1kwKBgQDFD+SYV4VhGz0a6DsG\nfCzKK5iehMsVRq3vkWmfe3S7kbGCh4f6PaIzIpWfQsiG7GmSpaEaI/LJTEKhIZ75\nJXn0RuGqhcl6JOFYFQYf3SPe4BG+MplAtUJBzoeSHo6LH/7u4FzfFXg42Mq9gc62\nuDzfPvXmd9tv6Bs3GF6TNUjGCwKBgQDD7oNwNyweWduBDdpVwJOc6jzQAb0iCWwO\nId/LDcxLs9/mZ2iTJRO9qQOnJnG8yHDjpgOT3UmibUl59pOtSKhiK/sCLZl6Wn8n\nkE5UWYhJWJdC1E5UCDVgHMckEtXh9dWp9+i8KUka3i+5hy2qeUIWQyHXx/G3+xmK\ndxAAHM0uRwKBgHx7YKTXTJsd8MipgHfFQynmxj8ElaD4B/H4wmcLPp8qFp7k7IGo\nI2j35No4/qE0gDAxzoXLxZdhRAmzSlAKW3JywCTO5Iny+CKDDV5dfEZS9wJVxjd5\nCMS3KS6lfNfnu0u8kQ4e6tXGJLP3ZtRHp7RCemU+u3CCh4aTL1MAatsnAoGBAL3p\n3vVSRS1WI/G/n7Ym5+3dDf0A8nafc2FvbCDByxhFzezipvaZpbzcqnHGTdCS6Pl/\n3U/h2pHaJLJXU2VPXAdsYe0GjhGOzllnAsW30uZlPJjGePXyzunOeyh4KWDQjL4n\nUiuwSPAGFXRbluP6jRhPEeq6H44ZkfQo3BV/1VHZAoGBAKsL5oUd/KVsXMyVIbTN\nBwyUDhfZM0GVYDM36etD6mT0Tu+boo7/PAYOq0lwpq/t0/epdaaaK3te+qKAIZDo\nNtOWxW8Y/vjkWgcimF5I+zJSELutvePblop7CUSLGVEPresR2U3sJWGXN3wBFF9y\np+PH0XlsDJqGVlSh0JvXACZ7\n-----END PRIVATE KEY-----\n",
"client_email": "annex-cloud-dev-bucket-access@manscaped-sre.iam.gserviceaccount.com",
"client_id": "112220529657587388922",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/annex-cloud-dev-bucket-access%40manscaped-sre.iam.gserviceaccount.com",
"universe_domain": "googleapis.com"
},
"azureCredentials": {
"azure_storage_account_name": "azeupreprodcrongcpfiles",
"azure_storage_account_key": "Xkb7C5fiIAAOYpyTa3HRqRc340E91PXLBFJZpl6BX4OKf3JW/RkAjX7pd13anbJDx8ignl/+YbGN+ASt2UDPfg==",
"azure_container_name": "az-eu-preprod-cron-client-gcp-files"
},
"filePayload": {
"action": "gcs_to_azure",
"gcs_bucket": "annex-cloud-dev",
"gcs_folder": "Snowflake",
"gcs_file": "Manscaped-ProductCatalog-2025-05-07-test02.csv"
}
}
6. Azure to GCS.
{
"credentials": {
"type": "service_account",
"project_id": "manscaped-sre",
"site_id": "200000",
"private_key_id": "4ae4f33553d5938d9e82bec5757ec39b920e829e",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCW0rUUE7JOnJ8u\nsFRaxcbBfZo1Um1puaJ/PUOg4+pRyW6rJ0MKAyIPUkUTRHgbZMfRCNPUc/iNHzwj\nTIcL+uQ0CCioROo95ysx/FMlsLSmtGGb0InI4fHbqia2sO0OFXuUhq2AJy7fsTsv\nNzniBgrZXYMD9EWALl1xweD+tqef9jU5DMUJs6m/orbU8pQpN9odaauJkwClZ1wW\n5c6iev72Uz4p9yxx5hIrbr6NjQMD0a7+30Gn0R4K4mI6o2ZpX3WJcA53zn6em45C\nE0kppdV4RlBiJT/U9vBwXCjDqhPrhQw7Og9Vvdqfx6CXvgK6uaxHQzgOIcVfTAfz\n/1alk+cNAgMBAAECggEADY7wl1bz8wwyZSyV0Lkx1mgXqk54JoZ5HK4hu79xa5vL\nlphHdwl0EOOI8SP5FiFXsytze4hXYjxaGCRDiaiqxiCrowDa++ihofM4eE1r2Aak\nNbEoCcaCCwi/RTIfhWIaA26d/0TGenaAxwxUaES5yfOVmQ/nBZBldxoCQ3OhGwiQ\nSQ9RIE/ABIZoxvslEV1lmltRpsnb8o6rexHuj7P918Z9CDrepqthfK+Ahudnak5g\nKwCLkHCSnSw/5dEz+P50In/qTJGMT6Vr+VxQYW1fjTXzkdS34jtgECb20z5usowT\nu2QVYZGfASOcZkhy4JbGInB3mTxTEfo3qke3P3S1kwKBgQDFD+SYV4VhGz0a6DsG\nfCzKK5iehMsVRq3vkWmfe3S7kbGCh4f6PaIzIpWfQsiG7GmSpaEaI/LJTEKhIZ75\nJXn0RuGqhcl6JOFYFQYf3SPe4BG+MplAtUJBzoeSHo6LH/7u4FzfFXg42Mq9gc62\nuDzfPvXmd9tv6Bs3GF6TNUjGCwKBgQDD7oNwNyweWduBDdpVwJOc6jzQAb0iCWwO\nId/LDcxLs9/mZ2iTJRO9qQOnJnG8yHDjpgOT3UmibUl59pOtSKhiK/sCLZl6Wn8n\nkE5UWYhJWJdC1E5UCDVgHMckEtXh9dWp9+i8KUka3i+5hy2qeUIWQyHXx/G3+xmK\ndxAAHM0uRwKBgHx7YKTXTJsd8MipgHfFQynmxj8ElaD4B/H4wmcLPp8qFp7k7IGo\nI2j35No4/qE0gDAxzoXLxZdhRAmzSlAKW3JywCTO5Iny+CKDDV5dfEZS9wJVxjd5\nCMS3KS6lfNfnu0u8kQ4e6tXGJLP3ZtRHp7RCemU+u3CCh4aTL1MAatsnAoGBAL3p\n3vVSRS1WI/G/n7Ym5+3dDf0A8nafc2FvbCDByxhFzezipvaZpbzcqnHGTdCS6Pl/\n3U/h2pHaJLJXU2VPXAdsYe0GjhGOzllnAsW30uZlPJjGePXyzunOeyh4KWDQjL4n\nUiuwSPAGFXRbluP6jRhPEeq6H44ZkfQo3BV/1VHZAoGBAKsL5oUd/KVsXMyVIbTN\nBwyUDhfZM0GVYDM36etD6mT0Tu+boo7/PAYOq0lwpq/t0/epdaaaK3te+qKAIZDo\nNtOWxW8Y/vjkWgcimF5I+zJSELutvePblop7CUSLGVEPresR2U3sJWGXN3wBFF9y\np+PH0XlsDJqGVlSh0JvXACZ7\n-----END PRIVATE KEY-----\n",
"client_email": "annex-cloud-dev-bucket-access@manscaped-sre.iam.gserviceaccount.com",
"client_id": "112220529657587388922",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/annex-cloud-dev-bucket-access%40manscaped-sre.iam.gserviceaccount.com",
"universe_domain": "googleapis.com"
},
"azureCredentials": {
"azure_storage_account_name": "azeupreprodcrongcpfiles",
"azure_storage_account_key": "Xkb7C5fiIAAOYpyTa3HRqRc340E91PXLBFJZpl6BX4OKf3JW/RkAjX7pd13anbJDx8ignl/+YbGN+ASt2UDPfg==",
"azure_container_name": "az-eu-preprod-cron-client-gcp-files"
},
"filePayload": {
"action": "azure_to_gcs",
"gcs_bucket": "annex-cloud-dev",
"gcs_file": "Snowflake/Manscaped-ProductCatalog-2025-05-07-test02.csv",
"azureFile": "Manscaped-ProductCatalog-2025-05-07-test02.csv"
}
}
7. Delete GCP Files.
{
"credentials": {
"type": "service_account",
"project_id": "manscaped-sre",
"site_id": "200000",
"private_key_id": "4ae4f33553d5938d9e82bec5757ec39b920e829e",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCW0rUUE7JOnJ8u\nsFRaxcbBfZo1Um1puaJ/PUOg4+pRyW6rJ0MKAyIPUkUTRHgbZMfRCNPUc/iNHzwj\nTIcL+uQ0CCioROo95ysx/FMlsLSmtGGb0InI4fHbqia2sO0OFXuUhq2AJy7fsTsv\nNzniBgrZXYMD9EWALl1xweD+tqef9jU5DMUJs6m/orbU8pQpN9odaauJkwClZ1wW\n5c6iev72Uz4p9yxx5hIrbr6NjQMD0a7+30Gn0R4K4mI6o2ZpX3WJcA53zn6em45C\nE0kppdV4RlBiJT/U9vBwXCjDqhPrhQw7Og9Vvdqfx6CXvgK6uaxHQzgOIcVfTAfz\n/1alk+cNAgMBAAECggEADY7wl1bz8wwyZSyV0Lkx1mgXqk54JoZ5HK4hu79xa5vL\nlphHdwl0EOOI8SP5FiFXsytze4hXYjxaGCRDiaiqxiCrowDa++ihofM4eE1r2Aak\nNbEoCcaCCwi/RTIfhWIaA26d/0TGenaAxwxUaES5yfOVmQ/nBZBldxoCQ3OhGwiQ\nSQ9RIE/ABIZoxvslEV1lmltRpsnb8o6rexHuj7P918Z9CDrepqthfK+Ahudnak5g\nKwCLkHCSnSw/5dEz+P50In/qTJGMT6Vr+VxQYW1fjTXzkdS34jtgECb20z5usowT\nu2QVYZGfASOcZkhy4JbGInB3mTxTEfo3qke3P3S1kwKBgQDFD+SYV4VhGz0a6DsG\nfCzKK5iehMsVRq3vkWmfe3S7kbGCh4f6PaIzIpWfQsiG7GmSpaEaI/LJTEKhIZ75\nJXn0RuGqhcl6JOFYFQYf3SPe4BG+MplAtUJBzoeSHo6LH/7u4FzfFXg42Mq9gc62\nuDzfPvXmd9tv6Bs3GF6TNUjGCwKBgQDD7oNwNyweWduBDdpVwJOc6jzQAb0iCWwO\nId/LDcxLs9/mZ2iTJRO9qQOnJnG8yHDjpgOT3UmibUl59pOtSKhiK/sCLZl6Wn8n\nkE5UWYhJWJdC1E5UCDVgHMckEtXh9dWp9+i8KUka3i+5hy2qeUIWQyHXx/G3+xmK\ndxAAHM0uRwKBgHx7YKTXTJsd8MipgHfFQynmxj8ElaD4B/H4wmcLPp8qFp7k7IGo\nI2j35No4/qE0gDAxzoXLxZdhRAmzSlAKW3JywCTO5Iny+CKDDV5dfEZS9wJVxjd5\nCMS3KS6lfNfnu0u8kQ4e6tXGJLP3ZtRHp7RCemU+u3CCh4aTL1MAatsnAoGBAL3p\n3vVSRS1WI/G/n7Ym5+3dDf0A8nafc2FvbCDByxhFzezipvaZpbzcqnHGTdCS6Pl/\n3U/h2pHaJLJXU2VPXAdsYe0GjhGOzllnAsW30uZlPJjGePXyzunOeyh4KWDQjL4n\nUiuwSPAGFXRbluP6jRhPEeq6H44ZkfQo3BV/1VHZAoGBAKsL5oUd/KVsXMyVIbTN\nBwyUDhfZM0GVYDM36etD6mT0Tu+boo7/PAYOq0lwpq/t0/epdaaaK3te+qKAIZDo\nNtOWxW8Y/vjkWgcimF5I+zJSELutvePblop7CUSLGVEPresR2U3sJWGXN3wBFF9y\np+PH0XlsDJqGVlSh0JvXACZ7\n-----END PRIVATE KEY-----\n",
"client_email": "annex-cloud-dev-bucket-access@manscaped-sre.iam.gserviceaccount.com",
"client_id": "112220529657587388922",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/annex-cloud-dev-bucket-access%40manscaped-sre.iam.gserviceaccount.com",
"universe_domain": "googleapis.com"
},
"filePayload": {
"action": "delete_gcs_files",
"gcs_bucket": "annex-cloud-dev",
"gcs_folder_path": "Snowflake/test_data_feed_gcp_20250522.csv"
}
}
Code to Check GCP Authentication
The following Python code verifies authentication with Google Cloud Platform (GCP) from within an AWS Lambda function. It uses the google-auth library to load the supplied service-account credentials and refresh an access token. This is useful for debugging or validating the GCP setup before performing operations like data transfers.
import json
from google.oauth2 import service_account
from google.auth.transport.requests import Request
def lambda_handler(event, context):
    try:
        body_content = event.get("body", "{}")
        credentials_dict = json.loads(body_content)
        print(credentials_dict)
        if "token_uri" not in credentials_dict:
            return {
                "statusCode": 400,
                "body": json.dumps({"status": "Failed", "message": "GCP credentials not found in event"})
            }
        scopes = ["https://www.googleapis.com/auth/cloud-platform"]
        credentials = service_account.Credentials.from_service_account_info(credentials_dict, scopes=scopes)
        credentials.refresh(Request())
        return {
            "statusCode": 200,
            "body": json.dumps({"status": "Success", "message": "GCP authentication successful"})
        }
    except Exception as e:
        print("Error:", str(e))
        return {
            "statusCode": 500,
            "body": json.dumps({"status": "Failed", "message": str(e)})
        }
Request URL to Verify GCP Authentication
Use the following URL to verify GCP authentication.
URL: https://wn3lyyw9ml.execute-api.us-east-2.amazonaws.com/develop/gcp-auth
Request Header
- Authorization Token: m_JEBuP5ewL
Request Payload Example:
{
"type": "service_account",
"project_id": "manscaped-sre",
"site_id": "200000",
"private_key_id": "4ae4f33553d5938d9e82bec5757ec39b920e829e",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCW0rUUE7JOnJ8u\nsFRaxcbBfZo1Um1puaJ/PUOg4+pRyW6rJ0MKAyIPUkUTRHgbZMfRCNPUc/iNHzwj\nTIcL+uQ0CCioROo95ysx/FMlsLSmtGGb0InI4fHbqia2sO0OFXuUhq2AJy7fsTsv\nNzniBgrZXYMD9EWALl1xweD+tqef9jU5DMUJs6m/orbU8pQpN9odaauJkwClZ1wW\n5c6iev72Uz4p9yxx5hIrbr6NjQMD0a7+30Gn0R4K4mI6o2ZpX3WJcA53zn6em45C\nE0kppdV4RlBiJT/U9vBwXCjDqhPrhQw7Og9Vvdqfx6CXvgK6uaxHQzgOIcVfTAfz\n/1alk+cNAgMBAAECggEADY7wl1bz8wwyZSyV0Lkx1mgXqk54JoZ5HK4hu79xa5vL\nlphHdwl0EOOI8SP5FiFXsytze4hXYjxaGCRDiaiqxiCrowDa++ihofM4eE1r2Aak\nNbEoCcaCCwi/RTIfhWIaA26d/0TGenaAxwxUaES5yfOVmQ/nBZBldxoCQ3OhGwiQ\nSQ9RIE/ABIZoxvslEV1lmltRpsnb8o6rexHuj7P918Z9CDrepqthfK+Ahudnak5g\nKwCLkHCSnSw/5dEz+P50In/qTJGMT6Vr+VxQYW1fjTXzkdS34jtgECb20z5usowT\nu2QVYZGfASOcZkhy4JbGInB3mTxTEfo3qke3P3S1kwKBgQDFD+SYV4VhGz0a6DsG\nfCzKK5iehMsVRq3vkWmfe3S7kbGCh4f6PaIzIpWfQsiG7GmSpaEaI/LJTEKhIZ75\nJXn0RuGqhcl6JOFYFQYf3SPe4BG+MplAtUJBzoeSHo6LH/7u4FzfFXg42Mq9gc62\nuDzfPvXmd9tv6Bs3GF6TNUjGCwKBgQDD7oNwNyweWduBDdpVwJOc6jzQAb0iCWwO\nId/LDcxLs9/mZ2iTJRO9qQOnJnG8yHDjpgOT3UmibUl59pOtSKhiK/sCLZl6Wn8n\nkE5UWYhJWJdC1E5UCDVgHMckEtXh9dWp9+i8KUka3i+5hy2qeUIWQyHXx/G3+xmK\ndxAAHM0uRwKBgHx7YKTXTJsd8MipgHfFQynmxj8ElaD4B/H4wmcLPp8qFp7k7IGo\nI2j35No4/qE0gDAxzoXLxZdhRAmzSlAKW3JywCTO5Iny+CKDDV5dfEZS9wJVxjd5\nCMS3KS6lfNfnu0u8kQ4e6tXGJLP3ZtRHp7RCemU+u3CCh4aTL1MAatsnAoGBAL3p\n3vVSRS1WI/G/n7Ym5+3dDf0A8nafc2FvbCDByxhFzezipvaZpbzcqnHGTdCS6Pl/\n3U/h2pHaJLJXU2VPXAdsYe0GjhGOzllnAsW30uZlPJjGePXyzunOeyh4KWDQjL4n\nUiuwSPAGFXRbluP6jRhPEeq6H44ZkfQo3BV/1VHZAoGBAKsL5oUd/KVsXMyVIbTN\nBwyUDhfZM0GVYDM36etD6mT0Tu+boo7/PAYOq0lwpq/t0/epdaaaK3te+qKAIZDo\nNtOWxW8Y/vjkWgcimF5I+zJSELutvePblop7CUSLGVEPresR2U3sJWGXN3wBFF9y\np+PH0XlsDJqGVlSh0JvXACZ7\n-----END PRIVATE KEY-----\n",
"client_email": "annex-cloud-dev-bucket-access@manscaped-sre.iam.gserviceaccount.com",
"client_id": "112220529657587388922",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/annex-cloud-dev-bucket-access%40manscaped-sre.iam.gserviceaccount.com",
"universe_domain": "googleapis.com"
}
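Before posting a payload like the one above, a cheap local check can catch malformed service-account JSON. This is a sketch: the handler only tests for "token_uri", and the remaining field names are standard service-account keys assumed here for illustration (the sample values below are placeholders, not real credentials):

```python
import json

# "token_uri" is what the Lambda checks for; the others are standard
# service-account fields assumed for this local sanity check.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def looks_like_service_account(raw: str) -> bool:
    """Return True if the string parses as JSON and carries the expected keys."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return isinstance(data, dict) and REQUIRED_FIELDS.issubset(data)

sample = json.dumps({
    "type": "service_account",
    "project_id": "example-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "svc@example-project.iam.gserviceaccount.com",
    "token_uri": "https://oauth2.googleapis.com/token",
})
print(looks_like_service_account(sample))      # → True
print(looks_like_service_account("not json"))  # → False
```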