I have files that I need to copy across to an AWS S3 bucket. At the moment I copy them to an Azure storage account, then download them to a VM's drive (let's say D:\dloaddata), and then copy them to AWS from the VM hard drive.
I am looking for a faster way of doing this, maybe copying to the Azure storage account and then straight to AWS, bypassing the download to a VM, as this costs.
Is there a way I can do this?
Comment: You can try Azure Data Factory to copy files from Azure Storage to an AWS bucket. – Venkatesan
1 Answer
"I am looking for a faster way of doing this, maybe copying to the Azure storage account and then straight to AWS, bypassing the download to a VM, as this costs."
You can use this MS document to copy data from an Azure storage account to an AWS bucket with Azure Data Factory.
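If you already have a Data Factory copy pipeline authored as described in that document, one way to kick it off from code is the azure-mgmt-datafactory SDK. This is only a minimal sketch, not part of the original answer: the subscription, resource group, factory and pipeline names below are placeholders for whatever you created when following the document.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder values - replace with your own subscription and factory details
subscription_id = '<your_subscription_id>'
resource_group = '<your_resource_group>'
factory_name = '<your_data_factory_name>'
pipeline_name = '<your_copy_pipeline_name>'  # the copy pipeline authored per the MS document

# Authenticate with Azure AD and create a Data Factory management client
adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Start a run of the copy pipeline and check its status
run = adf_client.pipelines.create_run(resource_group, factory_name, pipeline_name)
status = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id)
print(f"Pipeline run {run.run_id}: {status.status}")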
Also, you can use the below Python code to copy data from Azure Blob Storage to an AWS bucket using the boto3 (AWS) and azure-storage-blob (Azure) SDKs.
from azure.storage.blob import BlobServiceClient
import boto3

# Azure Blob Storage credentials
azure_account_name = 'xxx'
azure_account_key = 'xxxxx'
azure_container_name = 'test'
azure_blob_name = 'sample.txt'

# AWS S3 credentials
aws_access_key_id = '<your_aws_access_key_id>'
aws_secret_access_key = '<your_aws_secret_access_key>'
aws_bucket_name = '<your_aws_bucket_name>'
aws_object_key = '<your_aws_object_key>'

# Connect to the Azure storage account and get a client for the source blob
blob_service_client = BlobServiceClient(
    account_url=f"https://{azure_account_name}.blob.core.windows.net",
    credential=azure_account_key)
blob_client = blob_service_client.get_blob_client(
    container=azure_container_name, blob=azure_blob_name)

# Download the blob content into memory
data = blob_client.download_blob().readall()

# Upload the content to the S3 bucket
s3_client = boto3.client('s3',
                         aws_access_key_id=aws_access_key_id,
                         aws_secret_access_key=aws_secret_access_key)
s3_client.put_object(Bucket=aws_bucket_name, Key=aws_object_key, Body=data)
The above code copies data from Azure Storage to an AWS bucket, but because it reads the whole blob into memory it is best suited to smaller files.
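For larger files, one option (a sketch only, not from the original answer) is to stream the blob in chunks and relay it to S3 as a multipart upload, so the whole file never has to sit in memory. It reuses the blob_client and s3_client variables from the code above; the 8 MiB part size is just an example value.
# Stream the blob chunk by chunk and relay it to S3 as a multipart upload
part_size = 8 * 1024 * 1024  # S3 parts (except the last) must be at least 5 MiB
mpu = s3_client.create_multipart_upload(Bucket=aws_bucket_name, Key=aws_object_key)

parts = []
part_number = 1
buffer = b""

def upload_part(body, number):
    # Upload one part and record its ETag for the final completion call
    resp = s3_client.upload_part(Bucket=aws_bucket_name, Key=aws_object_key,
                                 UploadId=mpu["UploadId"],
                                 PartNumber=number, Body=body)
    parts.append({"PartNumber": number, "ETag": resp["ETag"]})

for chunk in blob_client.download_blob().chunks():
    buffer += chunk
    if len(buffer) >= part_size:
        upload_part(buffer, part_number)
        part_number += 1
        buffer = b""

if buffer:  # upload whatever is left as the final part
    upload_part(buffer, part_number)

s3_client.complete_multipart_upload(Bucket=aws_bucket_name, Key=aws_object_key,
                                    UploadId=mpu["UploadId"],
                                    MultipartUpload={"Parts": parts})
If you need to copy every file in the container rather than a single blob, you can wrap either version in a loop over blob_service_client.get_container_client(azure_container_name).list_blobs().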
Reference:
Batch transfer to AWS - Microsoft Q&A by AnnuKumari-MSFT