Boto – slow set_contents_from_filename when get_bucket is called with validate=False. I want to upload ~3K files (1 KB each) to S3 with boto, driven by a GreenPool.
My questions:
Why does the get_bucket() call take so long on every call, and what is the trade-off with the set_contents() timing? How can I work around it? Thanks!
More details:
get_bucket(validate=True) takes ~30 seconds on average, and the set_contents_from_filename call that follows it takes under 1 second. I tried switching to validate=False, which did bring the get_bucket() time down to under 1 second, but then the set_contents_from_filename time jumped to ~30 seconds. I can't find an explanation for this trade-off in the boto docs.
Code:
import logging
import time

import boto

S3_TRIES = 3  # retry count; defined elsewhere in the real module

def upload(bucket_str, key_str, file_path):
    # new S3 connection
    s3 = boto.connect_s3()
    # get the bucket (validate=True makes a round trip to S3)
    bucket_time = time.time()
    b = s3.get_bucket(bucket_str, validate=True)
    logging.info('get_bucket took %f seconds' % (time.time() - bucket_time))
    # create the key object locally (no request is made here)
    key_time = time.time()
    key = b.new_key(key_str)
    logging.info('new_key took %f seconds' % (time.time() - key_time))
    _e = None
    for i in range(S3_TRIES):
        try:
            up_time = time.time()
            key.set_contents_from_filename(file_path,
                headers={
                    'Content-Encoding': 'gzip',
                    'Content-Type': 'application/json',
                },
                policy='public-read')
            logging.info('set_contents took %f seconds' % (time.time() - up_time))
            key.set_acl('public-read')
            return True
        except Exception as e:
            logging.info('set_contents exception on iteration %d: %s' % (i, e))
            _e = e
    raise _e