I have a problem writing large amounts of data to a FILESTREAM column on SQL Server. Specifically, smallish files of around 1.5-2 GB are handled just fine, but when the size reaches 6 GB I get an intermittent IOException, "The handle is invalid", in .CopyTo() at the end of the transfer. How do I write a large file to a SQL Server FILESTREAM column?
I have thought about writing the data in chunks, but SQL Server copies the column's backing file before letting you append data to it, which destroys the performance for large files.
The code:
public long AddFragment(string location, string description = null)
{
    const string sql =
        @"insert into [Fragment] ([Description], [Data]) " +
        "values (@description, 0x); " +
        "select [Id], [Data].PathName(), " +
        "GET_FILESTREAM_TRANSACTION_CONTEXT() " +
        "from [Fragment] " +
        "where [Id] = SCOPE_IDENTITY();";

    long id;
    using (var scope = new TransactionScope(
        TransactionScopeOption.Required,
        new TransactionOptions
        {
            Timeout = TimeSpan.FromDays(1)
        }))
    {
        using (var connection = new SqlConnection(m_ConnectionString))
        {
            connection.Open();

            byte[] serverTx;
            string serverLocation;
            using (var command = new SqlCommand(sql, connection))
            {
                // DBNull.Value is needed when the optional description is
                // omitted; a plain null would leave the parameter unsent.
                command.Parameters.Add("@description", SqlDbType.NVarChar)
                    .Value = (object)description ?? DBNull.Value;
                using (var reader = command.ExecuteReader())
                {
                    reader.Read();
                    id = reader.GetSqlInt64(0).Value;
                    serverLocation = reader.GetSqlString(1).Value;
                    serverTx = reader.GetSqlBinary(2).Value;
                }
            }

            using (var source = new FileStream(location, FileMode.Open,
                FileAccess.Read, FileShare.Read, 4096,
                FileOptions.SequentialScan))
            using (var target = new SqlFileStream(serverLocation,
                serverTx, FileAccess.Write))
            {
                source.CopyTo(target);
            }
        }
        scope.Complete();
    }
    return id;
}
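For what it is worth, the chunking the question rules out need not reopen the column: writes made through one open SqlFileStream handle go to the same underlying file, and the server-side copy cost applies when an existing value is reopened and updated later. If an explicit loop is wanted, e.g. to use a larger buffer than Stream.CopyTo picks by default, a minimal sketch (the 1 MB buffer size is an assumption, not a measured optimum; CopyInChunks is a hypothetical helper, demonstrated here with MemoryStream in place of the FileStream/SqlFileStream pair):

```csharp
using System;
using System.IO;

static class ChunkedCopy
{
    // Copy source to target in fixed-size blocks through a single handle.
    public static void CopyInChunks(Stream source, Stream target,
        int bufferSize = 1 << 20)
    {
        var buffer = new byte[bufferSize];
        int read;
        // Read returns 0 only at end-of-stream, so the loop copies the
        // whole file regardless of its total size.
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            target.Write(buffer, 0, read);
        }
    }

    static void Main()
    {
        // In the question's code, source would be the FileStream and
        // target the SqlFileStream.
        using (var source = new MemoryStream(new byte[] { 1, 2, 3 }))
        using (var target = new MemoryStream())
        {
            CopyInChunks(source, target);
            Console.WriteLine(target.Length); // 3
        }
    }
}
```

Note that CopyTo also has an overload taking a buffer size, `source.CopyTo(target, 1 << 20)`, which achieves the same thing without the hand-written loop.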
Maybe have a look at http://msdn.microsoft.com/en-us/library/bb933972(v=sql.105).aspx – Paparazzi
That is a generic example, though; it writes 9 bytes to the column. – chase
Maybe it is a connection timeout. Have you tried increasing the connection timeout, for instance? By the way, just wondering: why do you use SQL Server for such large files? Has that become common practice with SQL Server these days? – ziya
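On the timeout suggestion: a sketch of where it would go, with the caveat that "Connect Timeout" only governs how long SqlConnection.Open() waits, not the streaming write itself, which SqlFileStream performs through the Win32 file API rather than the SQL connection (the server and database names below are placeholders):

```csharp
// Hypothetical connection string; only the Open() call is affected by
// Connect Timeout, not the subsequent SqlFileStream transfer.
var m_ConnectionString =
    "Data Source=.;Initial Catalog=FragmentDb;" +
    "Integrated Security=true;Connect Timeout=60;";
```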