How to transfer a file to Azure Blob Storage in chunks without writing to a file, using Python

I need to transfer files from Google Cloud Storage to Azure Blob Storage.

Google provides a code snippet for downloading files to a byte variable like so:
# Payload data
req = client.objects().get_media(
    bucket=bucket_name,
    object=object_name,
    generation=generation)  # optional

# The BytesIO object may be replaced with any io.Base instance.
fh = io.BytesIO()
downloader = MediaIoBaseDownload(fh, req, chunksize=1024*1024)

done = False
while not done:
    status, done = downloader.next_chunk()
    if status:
        print 'Download %d%%.' % int(status.progress() * 100)
print 'Download Complete!'
print fh.getvalue()
I was able to modify this to store the file to disk instead by changing the fh object type like so:
fh = open(object_name, 'wb')
Then I can upload it to Azure Blob Storage using blob_service.put_block_blob_from_path.
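For reference, the disk-based version looks roughly like this (req is the request object from the snippet above; account_name, account_key, container_name and blob_name are placeholders):

from googleapiclient.http import MediaIoBaseDownload
from azure.storage.blob import BlobService

# Download to a local file...
fh = open(object_name, 'wb')
downloader = MediaIoBaseDownload(fh, req, chunksize=1024 * 1024)
done = False
while not done:
    status, done = downloader.next_chunk()
fh.close()

# ...then upload that file to Azure Blob Storage.
blob_service = BlobService(account_name, account_key)
blob_service.put_block_blob_from_path(container_name, blob_name, object_name)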
I want to avoid writing to a local file on the machine doing the transfer.
I gather that Google's snippet loads the data into the io.BytesIO() object one chunk at a time. I reckon I should probably use that to write to Blob Storage one chunk at a time as well, along the lines of the sketch below.
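For illustration, here is a rough sketch of that chunk-at-a-time idea. Note that it uses the newer azure-storage-blob (v12) BlobClient API rather than the BlobService from the legacy SDK used elsewhere in this post; the connection string and container name are placeholders, and client, bucket_name and object_name are the same names as in the snippet above.

import base64
import io

from azure.storage.blob import BlobBlock, BlobClient
from googleapiclient.http import MediaIoBaseDownload

blob_client = BlobClient.from_connection_string(
    "<azure-storage-connection-string>", "<container>", object_name)

req = client.objects().get_media(bucket=bucket_name, object=object_name)
buf = io.BytesIO()
downloader = MediaIoBaseDownload(buf, req, chunksize=4 * 1024 * 1024)

block_ids = []
done = False
while not done:
    status, done = downloader.next_chunk()
    data = buf.getvalue()
    if data:
        # Fixed-width, base64-encoded block ids, as the Block Blob API expects.
        block_id = base64.b64encode('{0:08d}'.format(len(block_ids)).encode()).decode()
        blob_client.stage_block(block_id=block_id, data=data)
        block_ids.append(BlobBlock(block_id=block_id))
    # Reset the buffer so only one chunk is ever held in memory.
    buf.seek(0)
    buf.truncate()

blob_client.commit_block_list(block_ids)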
I experimented with reading the whole thing into memory and then uploading it with put_block_blob_from_bytes, but I got a MemoryError (the file is probably too big, ~600 MB).
any suggestions?
according source codes of blobservice.py
azure storage , blobreader
google cloud storage, can try use azure function blobservice.put_block_blob_from_file
write stream gcs class blobreader
has function read
stream, please see below.
so refering code https://cloud.google.com/appengine/docs/python/blobstore/#python_using_blobreader, can try below.
from google.appengine.ext import blobstore
from azure.storage.blob import BlobService

blob_key = ...
blob_reader = blobstore.BlobReader(blob_key)

blob_service = BlobService(account_name, account_key)
container_name = ...
blob_name = ...
blob_service.put_block_blob_from_file(container_name, blob_name, blob_reader)
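If the download side uses the JSON API client with MediaIoBaseDownload, as in the question, rather than App Engine's BlobReader, a small file-like wrapper can play the same role without touching local disk. This is only a sketch: GcsDownloadStream is a made-up helper name, it assumes put_block_blob_from_file only needs the stream's read() method, and object_size is assumed to come from the GCS object metadata (the other names are the same placeholders as above).

import io

from googleapiclient.http import MediaIoBaseDownload
from azure.storage.blob import BlobService


class GcsDownloadStream(object):
    """File-like wrapper that pulls chunks from a GCS media download on demand."""

    def __init__(self, request, chunksize=4 * 1024 * 1024):
        self._buffer = io.BytesIO()
        self._downloader = MediaIoBaseDownload(self._buffer, request, chunksize=chunksize)
        self._done = False

    def read(self, size=-1):
        # Download until at least `size` unconsumed bytes are buffered (or EOF).
        while not self._done and (size < 0 or self._buffer.tell() < size):
            _, self._done = self._downloader.next_chunk()
        self._buffer.seek(0)
        data = self._buffer.read() if size < 0 else self._buffer.read(size)
        remainder = self._buffer.read()
        # Keep only the unread tail so memory use stays around one chunk.
        self._buffer.seek(0)
        self._buffer.write(remainder)
        self._buffer.truncate()
        return data


req = client.objects().get_media(bucket=bucket_name, object=object_name)
stream = GcsDownloadStream(req)

blob_service = BlobService(account_name, account_key)
# Passing the object's size (taken from the GCS metadata) as `count` lets the
# SDK upload in blocks instead of reading the whole stream at once.
blob_service.put_block_blob_from_file(
    container_name, blob_name, stream, count=object_size)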