Editorials

Data Merging

Merging data changes has been a common practice for a long time. Microsoft even created a full-blown MERGE command that lets you perform Insert/Update/Delete data changes in a single statement.
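To give you an idea of what that looks like, here is a minimal sketch of a MERGE; the table and column names are just placeholders I made up for illustration:

  -- Rows in the staging table update or insert into the target;
  -- target rows missing from the staging table are deleted.
  MERGE dbo.Customer AS target
  USING dbo.CustomerStaging AS source
      ON target.CustomerID = source.CustomerID
  WHEN MATCHED THEN
      UPDATE SET target.Name  = source.Name,
                 target.Email = source.Email
  WHEN NOT MATCHED BY TARGET THEN
      INSERT (CustomerID, Name, Email)
      VALUES (source.CustomerID, source.Name, source.Email)
  WHEN NOT MATCHED BY SOURCE THEN
      DELETE;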

What is interesting to me as a consultant is how often I come across web applications that have no concept of merging data. Applications are written to pass sets of data as XML documents to stored procedures, where the data is parsed out. Then all of the existing data is removed, physically or virtually, and the new data is written in its place.
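The pattern usually looks roughly like this sketch (the procedure name, XML shape, and columns are hypothetical, but the shape of the logic is what I keep running into):

  CREATE PROCEDURE dbo.SaveCustomerOrders
      @CustomerID INT,
      @Orders     XML
  AS
  BEGIN
      -- Wipe everything the customer already has...
      DELETE FROM dbo.CustomerOrder WHERE CustomerID = @CustomerID;

      -- ...then shred the XML and write the incoming set in its place.
      INSERT INTO dbo.CustomerOrder (CustomerID, OrderID, Amount)
      SELECT @CustomerID,
             o.value('@OrderID', 'INT'),
             o.value('@Amount',  'MONEY')
      FROM @Orders.nodes('/Orders/Order') AS T(o);
  END;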

This method has a few shortcomings I’d like to point out:

  • There is no audit trail of changes. If you simply remove the old data and add the new, you don't know what was actually modified
  • If you use a soft delete (flagging a record as deleted rather than physically deleting it) on a highly active system, your storage consumption becomes unreasonable
  • What about multi-user access? When one user deletes and rewrites rows that another user has modified, those modifications are simply dropped
  • Performance suffers because of data volume, index fragmentation, and the sheer CPU overhead of parsing XML documents


Sending data as sets makes sense to me. Use temporary tables or User Defined Table Types instead. Get the data over to the server and simply perform the necessary merge against the target tables.
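As a rough sketch of what I mean, using a User Defined Table Type (again, the type, procedure, and table names are invented for illustration):

  CREATE TYPE dbo.CustomerOrderType AS TABLE
  (
      OrderID INT PRIMARY KEY,
      Amount  MONEY
  );
  GO

  CREATE PROCEDURE dbo.MergeCustomerOrders
      @CustomerID INT,
      @Orders     dbo.CustomerOrderType READONLY
  AS
  BEGIN
      -- Merge the incoming set against the existing rows
      -- instead of deleting and rewriting them.
      MERGE dbo.CustomerOrder AS target
      USING @Orders AS source
          ON target.CustomerID = @CustomerID
         AND target.OrderID    = source.OrderID
      WHEN MATCHED THEN
          UPDATE SET target.Amount = source.Amount
      WHEN NOT MATCHED BY TARGET THEN
          INSERT (CustomerID, OrderID, Amount)
          VALUES (@CustomerID, source.OrderID, source.Amount)
      WHEN NOT MATCHED BY SOURCE AND target.CustomerID = @CustomerID THEN
          DELETE;
  END;

From the application side you pass the whole set as a single structured parameter (in ADO.NET, for example, a DataTable sent as a table-valued parameter), so the server never has to parse XML at all.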

Do you think I’m being too critical here? Share your thoughts in the comments or drop an email to BTaylor@sswug.org.

Cheers,

Ben