Hello,
I am working on a web app and am at the point where multiple rows in my GUI need to be sent to SQL Server and saved when the user presses Save. We want to pass the rows to a working table, then begin a transaction, move the rows from the working table to the permanent tables with set processing, and commit.

The debate we are having is how to get the data to the work table. We can do individual inserts to the work table, one round trip per row (could be hundreds of rows), or concatenate all rows into one long string (up to 8K at a time), make a single call sending the long string, and parse it into the work table in SQL Server. We are trying to weigh the network usage and overhead of sending many short items vs. one long item against the CPU overhead of many inserts vs. string manipulation and parsing. Suggestions?
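Roughly what we have in mind for the single-call version, as a sketch only: all table, procedure, and column names are made up, it assumes a pipe between rows and a comma between fields, and it assumes well-formed input (every row has its comma).

-- Hypothetical work table, keyed by session so concurrent saves
-- don't collide with each other.
CREATE TABLE dbo.WorkRows (
    SessionId int         NOT NULL,
    Col1      varchar(50) NULL,
    Col2      varchar(50) NULL
)
GO

-- Accepts one pipe-delimited string of rows (e.g. 'a,1|b,2|c,3'),
-- splits it with CHARINDEX/SUBSTRING, and inserts each row:
-- one network round trip regardless of the row count.
CREATE PROCEDURE dbo.ParseRowsToWork
    @SessionId int,
    @Rows      varchar(8000)
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @row varchar(200), @pos int

    WHILE LEN(@Rows) > 0
    BEGIN
        -- peel off the next row up to the pipe (or the remainder)
        SET @pos = CHARINDEX('|', @Rows)
        IF @pos = 0
        BEGIN
            SET @row  = @Rows
            SET @Rows = ''
        END
        ELSE
        BEGIN
            SET @row  = LEFT(@Rows, @pos - 1)
            SET @Rows = SUBSTRING(@Rows, @pos + 1, LEN(@Rows))
        END

        -- split the row on its comma and insert into the work table
        INSERT INTO dbo.WorkRows (SessionId, Col1, Col2)
        VALUES (@SessionId,
                LEFT(@row, CHARINDEX(',', @row) - 1),
                SUBSTRING(@row, CHARINDEX(',', @row) + 1, LEN(@row)))
    END
END
GO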
Thank you,
Jeff

How about dumping to a text file, then using a stored procedure to import, and moving the files out of the way when complete?
That may even remove the need for a work table, since you will already have a log, and using DTS or BCP to import is pretty fast.
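Something along those lines could look like this, wrapping a BULK INSERT (a close cousin of bcp) in a proc. A sketch only: all names are made up, and it assumes each line in the file is SessionId,Col1,Col2 and that the SQL Server service account can read the path.

-- Hypothetical import procedure: the client writes its batch to a
-- uniquely named file, then passes the path in.
CREATE PROCEDURE dbo.ImportBatchFile
    @FileName varchar(255)
AS
BEGIN
    DECLARE @sql varchar(1000)

    -- BULK INSERT wants a literal path, so build the statement
    -- dynamically; the file is comma-delimited, one row per line
    SET @sql = 'BULK INSERT dbo.WorkRows FROM ''' + @FileName + ''''
             + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
END
GO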
Just a thought.

GW

Yeah, I was thinking that's what I would do...
But what happens when, let's say, you have 10 users trying to do this at the same time?

Your work table seems problematic.

Just dump the file... and use a sproc to bcp it in...
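If the work table does stay in the picture, one common way around the 10-user collision is to key every row by session and filter the set-based move on that key. A sketch with hypothetical names (SessionId matching the work table above, PermanentRows a stand-in for the real target):

-- Hypothetical move procedure: one transaction, set-based, and
-- filtered by SessionId so concurrent users never touch each
-- other's rows.
CREATE PROCEDURE dbo.MoveBatchToPermanent
    @SessionId int
AS
BEGIN
    SET NOCOUNT ON

    BEGIN TRAN
        INSERT INTO dbo.PermanentRows (Col1, Col2)
        SELECT Col1, Col2
        FROM dbo.WorkRows
        WHERE SessionId = @SessionId

        -- clear this session's batch out of the work table
        DELETE FROM dbo.WorkRows
        WHERE SessionId = @SessionId
    COMMIT TRAN
END
GO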