SQL Bulk copy with trigger in ASP.NET

Posted by: Hamid Narikkoden

I recently launched a new website and got an overwhelming response from customers, one that exceeded all my expectations. For a startup website, the customer inflow was incredible. My customer base kept expanding, and everything worked fantastically until the "Delay" possessed my till-then awesome website! As the customer base grew, I noticed the site degrading in performance and response time. I had no option left but to perform an exorcism: completely refactoring its code.

What the Exorcism revealed:

I built the website with ASP.NET MVC 4, Entity Framework 5, and SQL Server 2012. To deliver specific content to millions of users, I was using LINQ query expressions and looping through the entire user base. Because of this, and the bulky code around it, the website took ages to respond; it was behaving truly possessed: slow page loads, degraded performance, and a terrible user experience.

The LINQ queries consumed a lot of time, since each one had to be validated and translated into SQL before execution. Executing a stored procedure from my code didn't seem like a wise decision either, as it makes debugging the values involved harder. All I wanted was a simple, fast operation.

How could I make my website faster? The question puzzled me like crazy! Then the idea struck me: use the SqlBulkCopy class. SqlBulkCopy can insert large amounts of data into a SQL Server table very efficiently. Let me illustrate the scenario.

I have two tables: Profile and Content. After inserting rows into the Content table, I have to update the delivery date on the corresponding user's Profile record.
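For reference, the two tables can be sketched as plain C# models; the column names come from the code later in this post, while the types are assumptions on my part:

```csharp
// Assumed shape of the two tables, inferred from the columns
// used in the snippets below; the types are guesses.
public class Profile
{
    public int UserId { get; set; }
    public DateTime LastDeliveryDate { get; set; }
}

public class Content
{
    public int Id { get; set; }       // identity column
    public string Title { get; set; }
    public string Text { get; set; }
    public int UserId { get; set; }   // references Profile.UserId
}
```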

string conn = ConfigurationManager.ConnectionStrings["bulkcopy"].ConnectionString;

DataTable dt = new DataTable();
dt.Columns.Add(new DataColumn() { ColumnName = "Id", AutoIncrement = true });
dt.Columns.Add(new DataColumn() { ColumnName = "Title" });
dt.Columns.Add(new DataColumn() { ColumnName = "Text" });
dt.Columns.Add(new DataColumn() { ColumnName = "UserId" });

// Loop through the source data, creating a row per record
DataRow dr = dt.NewRow();
dr["Title"] = Title;
dr["Text"] = Text;
dr["UserId"] = UserId;
dt.Rows.Add(dr);

// FireTriggers makes the server run insert triggers for the copied rows
using (SqlBulkCopy copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.FireTriggers))
{
    copy.DestinationTableName = "Content";
    copy.WriteToServer(dt);
}

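When the row count runs into the millions, it can also help to send the rows in batches and to map source columns to destination columns explicitly; a minimal sketch, assuming the DataTable built above (the batch size and timeout values are illustrative, not tuned):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: batched bulk copy with explicit column mappings.
static void BulkInsertContent(string connectionString, DataTable rows)
{
    using (var copy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.FireTriggers))
    {
        copy.DestinationTableName = "Content";
        copy.BatchSize = 10000;      // commit every 10,000 rows instead of all at once
        copy.BulkCopyTimeout = 600;  // seconds; the 30-second default can be too short

        // Explicit mappings avoid relying on column ordinal positions
        copy.ColumnMappings.Add("Title", "Title");
        copy.ColumnMappings.Add("Text", "Text");
        copy.ColumnMappings.Add("UserId", "UserId");

        copy.WriteToServer(rows);
    }
}
```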

It may run to millions of rows, but I need the delivery date column on the Profile table updated after each insert into the Content table. So it seemed fair to go for an insert trigger on the Content table in the database. But since SqlBulkCopy inserts the rows in bulk, a trigger that reads the inserted UserId into a scalar variable would update only the first profile row. So I had to go for a cursor over the inserted rows inside the trigger.

Create an AFTER INSERT trigger:

Create trigger [updatedeliverydate] on [dbo].[Content]
AFTER INSERT
AS
BEGIN
    declare @UserId int;
    declare @cur cursor;
    set @cur = cursor for select i.UserId from inserted i;

    open @cur;
    fetch next from @cur into @UserId;

    while @@FETCH_STATUS = 0
    begin
        update Profile
        set LastDeliveryDate = GETUTCDATE() where UserId = @UserId;

        fetch next from @cur into @UserId;  -- fetch the next record
    end

    close @cur;
    deallocate @cur;
END

Thus, I could free my website from the "Delay" spirit! SqlBulkCopy made the operation super-fast, and the site has worked fabulously ever since.

Comments (5)
Chirag Patel (6 months ago)

Great! Hi, it works for me. Thanks!

Ujjwala Datta (2 years ago)

Yes, this code is helpful, but I have a requirement: I want to truncate the destination table itself (Content) before uploading the data. For that I used an INSTEAD OF trigger, which is not working. Can you suggest a better solution for this?

Hamid Narikkoden (2 years ago)

@Karthi: @Arun: Thanks for your feedback. Dapper, a simple micro ORM, supports bulk data insertion. It just extends the System.Data.IDbConnection interface and provides typed output. A sample query can be as follows:

using (IDbConnection connection = new SqlConnection(constring))
{
    connection.Execute(@"insert into details values (@Name, @Salary, @Basic_Pay)", detaillist());
}

We can pass a collection of models to the insert query, and on execution the rows get inserted very quickly (about 5 seconds for roughly 10,000 entries).
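For context, Dapper expands a collection parameter by executing the statement once per element. A sketch of the pieces the snippet above assumes; the Detail class and detaillist() are hypothetical names filled in from the parameter names in the query:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

// Hypothetical model matching the @Name, @Salary, @Basic_Pay parameters above
public class Detail
{
    public string Name { get; set; }
    public decimal Salary { get; set; }
    public decimal Basic_Pay { get; set; }
}

static IEnumerable<Detail> detaillist()
{
    // Source of the rows to insert; stubbed here for illustration
    yield return new Detail { Name = "A", Salary = 1000m, Basic_Pay = 800m };
}

static void InsertDetails(string constring)
{
    using (IDbConnection connection = new SqlConnection(constring))
    {
        // Dapper runs the INSERT once per element of the collection
        connection.Execute(
            "insert into details values (@Name, @Salary, @Basic_Pay)",
            detaillist());
    }
}
```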

Arun V B (2 years ago)

Good read, well written.
