I am using the program below to upload large files to Azure blob storage. When the file is small (under about 500 KB) the program works fine; otherwise I get an error on the following line:
blob.PutBlock(blockIdBase64, stream, null);
The error is: "An unhandled exception of type 'Microsoft.WindowsAzure.Storage.StorageException' occurred in Microsoft.WindowsAzure.Storage.dll. Additional information: The remote server returned an error: (400) Bad Request."
The exception carries no further details, so I'm not sure what the problem is. Are there any suggestions as to what might be wrong in the program below?
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main(string[] args)
    {
        string accountName = "newstg";
        string accountKey = "fFB86xx5jbCj1A3dC41HtuIZwvDwLnXg==";
        // list of all uploaded block ids; needed for committing them at the end
        var blockIdList = new List<string>();
        StorageCredentials creds = new StorageCredentials(accountName, accountKey);
        CloudStorageAccount storageAccount = new CloudStorageAccount(creds, useHttps: true);
        CloudBlobClient client = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer sampleContainer = client.GetContainerReference("newcontainer2");
        string fileName = @"C:\sample.pptx";
        CloudBlockBlob blob = sampleContainer.GetBlockBlobReference("APictureFile6");
        using (var file = new FileStream(fileName, FileMode.Open, FileAccess.Read))
        {
            int blockSize = 1;
            // block counter
            var blockId = 0;
            // read the file block by block until the end
            while (file.Position < file.Length)
            {
                // calculate buffer size (blockSize in KB)
                var bufferSize = blockSize * 1024 < file.Length - file.Position ? blockSize * 1024 : file.Length - file.Position;
                var buffer = new byte[bufferSize];
                // read data into the buffer
                file.Read(buffer, 0, buffer.Length);
                // wrap the data in a memory stream and put it to storage
                using (var stream = new MemoryStream(buffer))
                {
                    // set stream position to the start
                    stream.Position = 0;
                    // convert the block id to a Base64-encoded string
                    var blockIdBase64 = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId.ToString(CultureInfo.InvariantCulture)));
                    blob.PutBlock(blockIdBase64, stream, null);
                    blockIdList.Add(blockIdBase64);
                    // increase the block id
                    blockId++;
                }
            }
            file.Close();
        }
        blob.PutBlockList(blockIdList);
    }
}
Why are you still using the low-level PutBlock API? The Azure library has much simpler ways to upload data. – usr
@usr: Could you suggest a library that works with streams and perhaps uses chunking instead of a buffer, to avoid memory issues? – user1400915
Google for "Azure .NET upload file". This is a solved problem. – usr
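
As a rough sketch of the high-level path the comments refer to (assuming the same Microsoft.WindowsAzure.Storage client library, account, container, and blob names as in the question; the exact signatures can vary slightly between SDK versions), UploadFromStream does the chunking internally:

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

class HighLevelUploadSketch
{
    static void Main()
    {
        // Same credentials, container, and blob names as in the question above.
        var creds = new StorageCredentials("newstg", "fFB86xx5jbCj1A3dC41HtuIZwvDwLnXg==");
        var storageAccount = new CloudStorageAccount(creds, useHttps: true);
        CloudBlobClient client = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("newcontainer2");
        CloudBlockBlob blob = container.GetBlockBlobReference("APictureFile6");

        // Block size the library uses when it splits the stream (example value).
        blob.StreamWriteSizeInBytes = 4 * 1024 * 1024;

        // UploadFromStream reads the file in blocks and commits the block list itself,
        // so only one block at a time is held in memory.
        using (FileStream file = File.OpenRead(@"C:\sample.pptx"))
        {
            blob.UploadFromStream(file);
        }
    }
}

StreamWriteSizeInBytes only controls how large the internally uploaded blocks are; the 4 MB used here is just an illustrative value.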