Transferring large files to Amazon S3 using C# - The request was aborted and canceled
2012-12-16

I am trying to upload a large file to Amazon S3. At first I used PutObject, which worked, but it took about five hours to upload a 2 GB file. So I read some suggestions online and tried TransferUtility instead.

I increased the timeout, but this TransferUtility API keeps giving me the error "The request was aborted. The request was canceled."

Code sample:

public void UploadWithMultiPart(string BucketName, string s3_key, string fileName) 
{ 
    var fileTransferUtility = new Amazon.S3.Transfer.TransferUtility(_accessKey, _secretKey); 
    var request = new Amazon.S3.Transfer.TransferUtilityUploadRequest() 
     .WithBucketName(BucketName) 
     .WithKey(s3_key) 
     .WithFilePath(fileName) 
     .WithTimeout(60*60*1000*100) 
     .WithPartSize(1024 * 1024 * 100) 
     .WithCannedACL(S3CannedACL.PublicRead) 
     .WithStorageClass(S3StorageClass.ReducedRedundancy); 

    request.Timeout = 60*60*1000*100; 

    fileKey = s3_key; 

    request.UploadProgressEvent += new EventHandler<UploadProgressArgs>(uploadRequest_UploadPartProgressEvent); 

    //.with = 30000 
    // .AddHeader("x-amz-acl", "public-read") 

    fileTransferUtility.Upload(request); 
} 

public void Upload(string BucketName, string s3_key, string fileName) 
{ 
    Amazon.S3.Model.PutObjectRequest request = new Amazon.S3.Model.PutObjectRequest(); 
    request.WithBucketName(BucketName); 
    request.WithKey(s3_key); 
    request.WithFilePath(fileName); 
    request.Timeout = -1; 
    request.ReadWriteTimeout = 30000; 
    request.AddHeader("x-amz-acl", "public-read"); 

    s3Client.PutObject(request); 
} 

Have you already gone through this solution: http://stackoverflow.com/questions/3871430/how-to-upload-files-to-amazon-s3-official-sdk-that-than-5-mb-appro? – Amar


That's not the exact post, but it's similar. I ended up using TransferUtility + ConfigOptions to split the parts into the right size – Arcadian
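
As a rough illustration of that comment, the part size can be set explicitly on the upload request. This is a minimal sketch assuming the same TransferUtility API shown elsewhere in this question; the 16 MB part size is an example value, not the configuration Arcadian actually used:

using Amazon.S3.Transfer;

// Minimal sketch (assumed values): upload with an explicit part size so the
// multipart upload is split into pieces of a known size.
var utility = new TransferUtility(_accessKey, _secretKey);
var uploadRequest = new TransferUtilityUploadRequest()
{
    BucketName = BucketName,
    Key = s3_key,
    FilePath = fileName,
    PartSize = 16 * 1024 * 1024   // 16 MB parts -- example value only
};
utility.Upload(uploadRequest);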

Answer


Try this:

using System;
using System.Threading;
using Amazon.S3;
using Amazon.S3.Transfer;

private TransferUtility transferUtility;

// Create the TransferUtility with your AWS credentials (e.g. in the constructor).
transferUtility = new TransferUtility(awsAccessKey, awsSecretKey);

// UploadComplete is a method you supply that matches the AsyncCallback
// delegate, i.e. void UploadComplete(IAsyncResult result).
AsyncCallback callback = new AsyncCallback(UploadComplete);

// Describe the upload: source file, destination bucket/key, and storage options.
var putObjectRequest = new Amazon.S3.Transfer.TransferUtilityUploadRequest()
{
    FilePath = filePath,
    BucketName = awsBucketName,
    Key = awsFilePath,
    ContentType = contentType,
    StorageClass = S3StorageClass.ReducedRedundancy,
    ServerSideEncryptionMethod = ServerSideEncryptionMethod.AES256,
    CannedACL = S3CannedACL.Private
};

// Start the upload asynchronously, then finish it on a thread-pool thread;
// EndUpload blocks until the transfer completes and rethrows any error.
IAsyncResult ar = transferUtility.BeginUpload(putObjectRequest, callback, null);
ThreadPool.QueueUserWorkItem(c =>
{
    transferUtility.EndUpload(ar);
});
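
The snippet above references an UploadComplete method but never defines it. Here is a minimal sketch of such a callback, assuming it only needs to observe completion; since EndUpload is already called on the thread-pool thread above (where it blocks until the transfer finishes and rethrows any exception), the callback itself can simply log:

// Hypothetical completion callback matching the AsyncCallback delegate.
// EndUpload is invoked elsewhere, so this handler only logs the result.
private void UploadComplete(IAsyncResult result)
{
    Console.WriteLine("S3 upload callback fired; completed = " + result.IsCompleted);
}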