I have a similar setup.
I am using a stored procedure with a table-valued parameter (TVP) and MERGE. See also Table-Valued Parameters for an example of how to use them from .NET; a plain T-SQL calling example appears after the procedure below.
I would shift the focus of the problem from simple bulk insertion to merging a batch of rows into a table that already contains data.
Destination table
CREATE TABLE [dbo].[MyTable](
    [DeviceId] [int] NOT NULL,
    [LogDate] [datetime] NOT NULL,
    [LogType] [int] NOT NULL,
    [LogText] [nvarchar](50) NOT NULL,
    CONSTRAINT [PK_MyTable] PRIMARY KEY CLUSTERED
    (
        [DeviceId] ASC,
        [LogDate] ASC,
        [LogType] ASC
    )
)
Create a custom table type
CREATE TYPE [dbo].[MyTableType] AS TABLE(
    [DeviceId] [int] NOT NULL,
    [LogDate] [datetime] NOT NULL,
    [LogType] [int] NOT NULL,
    [LogText] [nvarchar](50) NOT NULL,
    PRIMARY KEY CLUSTERED
    (
        [DeviceId] ASC,
        [LogDate] ASC,
        [LogType] ASC
    )
)
Test and measure whether specifying a PRIMARY KEY on the TYPE makes the overall process faster or slower; a sketch of the key-less variant follows.
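For that comparison, a minimal sketch of the same type declared without the primary key (the MyTableTypeHeap name is just a placeholder for benchmarking, not part of the original setup):

CREATE TYPE [dbo].[MyTableTypeHeap] AS TABLE(
    [DeviceId] [int] NOT NULL,
    [LogDate] [datetime] NOT NULL,
    [LogType] [int] NOT NULL,
    [LogText] [nvarchar](50) NOT NULL
    -- No PRIMARY KEY: rows arrive as a heap. Swap this type into the
    -- procedure signature and compare timings against the keyed version.
)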
Stored procedure with a TVP
CREATE PROCEDURE [dbo].[MergeMyTable]
    @ParamRows dbo.MyTableType READONLY
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    BEGIN TRANSACTION;
    BEGIN TRY
        MERGE INTO dbo.MyTable AS Dest
        USING (
            SELECT TT.[DeviceId], TT.[LogDate], TT.[LogType], TT.[LogText]
            FROM @ParamRows AS TT
        ) AS Src
        ON  (Dest.[DeviceId] = Src.[DeviceId])
        AND (Dest.[LogDate] = Src.[LogDate])
        AND (Dest.[LogType] = Src.[LogType])
        WHEN MATCHED THEN
            UPDATE SET Dest.[LogText] = Src.[LogText]
        WHEN NOT MATCHED BY TARGET THEN
            INSERT ([DeviceId], [LogDate], [LogType], [LogText])
            VALUES (Src.[DeviceId], Src.[LogDate], Src.[LogType], Src.[LogText]);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        ROLLBACK TRANSACTION;
        -- Rethrow so the caller learns the merge failed instead of
        -- the error being silently swallowed.
        THROW;
    END CATCH;
END
Call this stored procedure, passing it a batch of rows to merge. Test and measure how performance changes with the batch size; try batches of 1K, 10K, and 100K rows.
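A minimal T-SQL sketch of such a test (the generated device IDs, dates, and texts are made-up sample data; adjust @BatchSize to 1K/10K/100K):

-- Fill a table variable of the custom type with generated rows.
DECLARE @BatchSize int = 10000;
DECLARE @Batch dbo.MyTableType;

INSERT INTO @Batch ([DeviceId], [LogDate], [LogType], [LogText])
SELECT
    n % 50,                            -- 50 fake devices
    DATEADD(SECOND, n, '2020-01-01'),  -- unique timestamps keep the PK unique
    n % 3,                             -- a few log types
    CONCAT(N'test message ', n)
FROM (
    SELECT TOP (@BatchSize)
        ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects AS a CROSS JOIN sys.all_objects AS b
) AS nums;

-- Pass the whole batch to the procedure in a single call.
EXEC [dbo].[MergeMyTable] @ParamRows = @Batch;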
If you never need to update existing rows with new values, remove the WHEN MATCHED THEN clause from the MERGE; it will run faster.
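With that clause removed, the MERGE reduces to its insert branch; a sketch:

MERGE INTO dbo.MyTable AS Dest
USING @ParamRows AS Src
ON  (Dest.[DeviceId] = Src.[DeviceId])
AND (Dest.[LogDate] = Src.[LogDate])
AND (Dest.[LogType] = Src.[LogType])
-- No WHEN MATCHED clause: rows that already exist are left untouched,
-- so the engine skips the update work entirely.
WHEN NOT MATCHED BY TARGET THEN
    INSERT ([DeviceId], [LogDate], [LogType], [LogText])
    VALUES (Src.[DeviceId], Src.[LogDate], Src.[LogType], Src.[LogText]);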