I need help speeding up my application, please. The application reads roughly 53,000 delimited records, parses each line, and then sends each row to the database to be written. So far I have identified the database side as the main bottleneck, and I would appreciate help tuning it. It currently takes about 20 minutes to process all 53,000 records, each with 190 fields, and I want to reduce that significantly, starting with the code that sends the data to the database.
I am using Enterprise Library 5 (taking advantage of its connection pooling) to connect to the database, like so:
internal void SaveItem(String connString)
{
    try
    {
        ImportDataAccessor dbacess = new ImportDataAccessor(connString);
        foreach (ItemDetail item in itemEZdetails)
        {
            if (dbacess.SaveProduct(RecordID, item))
            {
                updatecounter++;
            }
        }
        successfulsaves = dbacess.RowsProcessed;
        updatecounter = dbacess.TotalRows;
    }
    catch (Exception ex)
    {
        throw; // "throw;" (not "throw ex;") preserves the original stack trace
    }
}
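One low-risk change worth measuring first: commit all the inserts in a single transaction instead of letting each stored-procedure call auto-commit. The sketch below is hypothetical and assumes the standard Enterprise Library 5 `Database` object (`DatabaseFactory.CreateDatabase`, `CreateConnection`, and the `ExecuteNonQuery(DbCommand, DbTransaction)` overload all exist in EntLib 5); the method and variable names mirror the question's code but are placeholders.

```csharp
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;

// Hypothetical variant of SaveItem: one connection, one transaction,
// one commit for all 53,000 rows instead of 53,000 implicit commits.
internal void SaveItemInOneTransaction()
{
    Database db = DatabaseFactory.CreateDatabase(); // assumes a configured default database
    using (DbConnection conn = db.CreateConnection())
    {
        conn.Open();
        using (DbTransaction tx = conn.BeginTransaction())
        {
            foreach (ItemDetail item in itemEZdetails)
            {
                DbCommand cmd = db.GetStoredProcCommand("IDW_spEZViewRecordImport");
                db.AddInParameter(cmd, "SessionUID", DbType.String, RecordID);
                // ...the other 189 parameters, as in the original code...
                db.ExecuteNonQuery(cmd, tx); // EntLib overload that enlists the transaction
            }
            tx.Commit(); // single commit; on exception the using-block rolls back
        }
    }
}
```

This keeps your stored procedure and per-row logic intact, so it is easy to A/B against the current 20-minute baseline before trying more invasive changes.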
public bool SaveProduct(String RecordUID, ItemDetail item)
{
    DbCommand insertCommand = db.GetStoredProcCommand("IDW_spEZViewRecordImport");
    db.AddInParameter(insertCommand, "SessionUID", DbType.String, RecordUID);
    // the other 189 parameters go here
    int rowsAffected = db.ExecuteNonQuery(insertCommand);
    String returnval = String.Empty;
    if (db.GetParameterValue(insertCommand, "ReturnVal") != DBNull.Value)
        returnval = (String)db.GetParameterValue(insertCommand, "ReturnVal");
    if (returnval == "60")
        RowsProcessed++;
    return rowsAffected > 0; // original assigned 'result' but never declared or returned it
}
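If the per-row stored-procedure call itself is the bottleneck, the biggest win on SQL Server is usually to stop making 53,000 round trips at all: stage the parsed rows into a `DataTable` and push them with `SqlBulkCopy` in one streamed operation. This is a sketch under assumptions, not your current design: `dbo.ItemStaging` is a hypothetical staging table (you would then run your validation/merge logic set-based on the server), and the `DataTable` is assumed to have columns matching the staging table's 190 fields.

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical bulk path: one streamed copy instead of 53,000 procedure calls.
public void BulkSave(string connString, DataTable stagedRows)
{
    using (SqlBulkCopy bulk = new SqlBulkCopy(connString))
    {
        bulk.DestinationTableName = "dbo.ItemStaging"; // assumed staging table
        bulk.BatchSize = 5000;        // send rows to the server in chunks
        bulk.BulkCopyTimeout = 0;     // disable the timeout for the long-running copy
        bulk.WriteToServer(stagedRows); // streams all rows in one operation
    }
}
```

Bulk-loading into a staging table and then applying the `ReturnVal`-style per-row checks as a set-based `MERGE` or `UPDATE` on the server typically turns minutes of chatty round trips into seconds of bulk I/O.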
How can I accomplish this with the code I have now? Thanks in advance.

When inserting multiple rows, you can use a single multi-row INSERT:
Insert into ItemType (ItemTypeName)
VALUES
('TYPE1'),
('TYPE2'),
('TYPE3')
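The same multi-row `VALUES` idea can be generated from code so each round trip carries a batch of rows instead of one. This is a hypothetical, parameterized sketch (plain ADO.NET rather than Enterprise Library; `names` and the connection are placeholders), and note that SQL Server caps a single `VALUES` row constructor at 1000 rows, so large inputs must be chunked:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

// Hypothetical helper: build one parameterized multi-row INSERT for a batch.
public static SqlCommand BuildBatchInsert(IList<string> names)
{
    StringBuilder sql = new StringBuilder("INSERT INTO ItemType (ItemTypeName) VALUES ");
    SqlCommand cmd = new SqlCommand();
    for (int i = 0; i < names.Count; i++)
    {
        if (i > 0) sql.Append(", ");
        sql.AppendFormat("(@p{0})", i);               // one row constructor per item
        cmd.Parameters.AddWithValue("@p" + i, names[i]); // keeps values parameterized
    }
    cmd.CommandText = sql.ToString();
    return cmd; // caller assigns an open SqlConnection and calls ExecuteNonQuery
}
```

With 190 columns per row the parameter count also matters (SQL Server allows at most 2100 parameters per command), so batches would have to stay small; that is one reason the bulk-copy route tends to scale better for wide records.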
Have a look here: http://stackoverflow.com/questions/7070011/writing-large-number-of-records-bulk-insert-to-access-in-net-c – tranceporter