c# - Bulk insert is not working properly in Azure SQL Server


I am not able to insert a bulk amount of data into an Azure SQL Server DB using a C# Web API.

Consider:

I want to insert more than 60K rows of data into SQL. There is no problem in my local SQL Server, but in Azure SQL the connection times out.

My approaches (all of them work in my local SQL Server, but not in Azure SQL Server):

1) Tried inserting records one by one using EF (around 10 minutes for 10,000 records, and it mostly times out)

2) Tried a bulk insert extension together with EF

3) Tried it with SqlBulkCopy

4) Tried increasing the connection timeout in the connection string

5) Tried increasing the command timeout in the DbContext (see the sketch after this list for what 4 and 5 typically look like)
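
For reference, a minimal sketch of approaches 4) and 5), assuming an EF6-style DbContext named DBModel (the name used in the code further below); the concrete numbers are placeholders, not recommendations:

// 4) Connection timeout (in seconds) is part of the connection string:
//    "Server=...;Database=...;Connection Timeout=60;..."

// 5) Command timeout (in seconds) is set on the EF6 DbContext:
using (var context = new DBModel())
{
    context.Database.CommandTimeout = 300; // applies to commands issued by this context
    // ... perform the inserts ...
}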

Exception stack trace:

Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)

Is there any solution, or any configuration that needs to be changed in Azure?

Update

Code used for the bulk insert:

using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
{
    dbConnection?.Open();

    using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
    {
        try
        {
            // Column mappings: each destination DB column is mapped
            // to the corresponding DataTable column.
            sqlBulkCopy.EnableStreaming = true;
            sqlBulkCopy.BulkCopyTimeout = 500;
            sqlBulkCopy.DestinationTableName = "LogTable";

            // dt is the DataTable holding the rows to insert
            sqlBulkCopy.WriteToServer(dt);
        }
        catch (Exception ex)
        {
            // Exception is swallowed here; it should at least be logged.
        }
    }
}

Best Answer

I suggest you set sqlBulkCopy.BatchSize to a reasonable amount, instead of inserting everything in one big batch. Depending on the data you are inserting, try starting with a batch size of 10,000 and work your way up or down until you are satisfied with the performance.

Edit, some additional clarification: when you are considering a batch size, you need to take into account that SqlBulkCopy will not only need to insert the data, but also read and send it. That last part is probably the reason it works on your local SQL Server but not on Azure. It also means that, if you are working with a large dataset, you will need to use a lower batch size, or a considerably higher BulkCopyTimeout setting, to allow each batch a chance to finish before reaching the timeout limit.
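
A minimal sketch of that suggestion, reusing the dt DataTable, LogTable destination, and open dbConnection from the question's code; the numbers are starting points to tune, not recommendations:

using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
{
    sqlBulkCopy.DestinationTableName = "LogTable";
    sqlBulkCopy.EnableStreaming = true;
    sqlBulkCopy.BatchSize = 10000;   // rows sent per round trip; tune up or down
    sqlBulkCopy.BulkCopyTimeout = 0; // 0 = no limit; or pick a generous per-batch value
    sqlBulkCopy.WriteToServer(dt);
}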

You can read more about batch sizes in this post: What is the recommended batch size for SqlBulkCopy?

Other option:
I was reading up on this, and it could simply be that your insert reaches a critical DTU (Database Transaction Unit, basically a measure of the server's combined resources) usage point.

Performance levels are calibrated and governed to provide the needed resources to run your database workload up to the max limits allowed for your selected service tier/performance level. If your workload is hitting the limits in one of CPU/Data IO/Log IO limits, you will continue to receive the resources at the maximum allowed level, but you are likely to see increased latencies for your queries. These limits will not result in any errors, but just a slowdown in your workload, unless the slowdown becomes so severe that queries start timing out.

Taken from this link: https://azure.microsoft.com/da-dk/blog/azure-sql-database-introduces-new-near-real-time-performance-metrics/
Try starting the copy again while monitoring the DTU usage, and see whether it stays at 100% for long periods of time. If it does, you may want to raise the pricing tier of your database.
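
As an illustration of that monitoring step (not part of the original answer), a small sketch that reads the Azure SQL Database DMV sys.dm_db_resource_stats, which records resource usage in roughly 15-second intervals; the connectionString variable is a placeholder:

using System;
using System.Data.SqlClient;

static void PrintRecentDtuUsage(string connectionString)
{
    // Latest usage samples; values are percentages of the tier's limits.
    const string query = @"
        SELECT TOP (10) end_time,
               avg_cpu_percent, avg_data_io_percent, avg_log_write_percent
        FROM sys.dm_db_resource_stats
        ORDER BY end_time DESC;";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(query, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine($"{reader.GetDateTime(0):u}  CPU {reader.GetDecimal(1)}%  " +
                                  $"Data IO {reader.GetDecimal(2)}%  Log IO {reader.GetDecimal(3)}%");
            }
        }
    }
}

If any of the three percentages sits near 100 for the duration of the copy, that resource is the bottleneck the quoted paragraph describes.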

Regarding c# - Bulk insert is not working properly in Azure SQL Server, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41697182/
