Editorials

XML Abused

Malleable software is the ultimate, yet unreachable, goal for software solutions. We want to write software that is efficient and works according to requirements, yet accommodates change readily. Computers don’t do that well. Computer software doesn’t do it much better.

One key technique for making software more extensible is to allow it to change in small pieces without impacting the larger software system. We do this by isolating processes for specific duties or responsibilities. In order to get the entire system to work, the different processes must have some technique for communicating with one another. In the late 1990s and early 2000s the SOAP protocol for accessing remote modules was developed, and XML was the format used to communicate data between systems.

XML was chosen because it was not system specific. It wasn’t limited to a 16-bit, 32-bit, or larger address space. It supported complex data types, including sets, and even sets within sets of data. Using an XSD, the data was well defined, yet it could be transported in a ubiquitous way supported by any computer capable of communication.

The intention was not to make XML data a grab bag capable of holding data of any form that could even change from one usage to the next. It was still well-formed, well-typed data, using generic data types that could be converted to native types on many dissimilar machines.
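
For a sense of what “well typed” looks like in practice, SQL Server lets you bind an XSD to an xml variable through an XML schema collection. The sketch below is purely illustrative; the schema, names, and values are invented:

   CREATE XML SCHEMA COLLECTION dbo.OrderSchema AS
   N'<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
       <xs:element name="Order">
         <xs:complexType>
           <xs:sequence>
             <xs:element name="OrderId"  type="xs:int" />
             <xs:element name="Customer" type="xs:string" />
             <xs:element name="Total"    type="xs:decimal" />
           </xs:sequence>
         </xs:complexType>
       </xs:element>
     </xs:schema>';
   GO

   -- A variable typed against the collection: content that is malformed,
   -- mis-typed or mis-shaped is rejected at assignment time.
   DECLARE @order xml(dbo.OrderSchema);
   SET @order = N'<Order>
                    <OrderId>42</OrderId>
                    <Customer>Contoso</Customer>
                    <Total>19.95</Total>
                  </Order>';

The point isn’t the particular SQL Server feature; it’s that the shape and the types of the data are declared up front, and anything that doesn’t match is rejected before it reaches your code.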

It appears to me that much of the usage of XML has been abused, losing some of the value of its original intention. Today, XML is often abused much like the old VB Variant data type. No XSD is considered. The data is never thought out…it is simply put into XML because XML can hold anything, without thought or reason. Rather than understanding the shape of the data, we discover it by traversing the tags inside the document. The contents are discovered rather than declared.
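
Contrast the declared shape above with the grab-bag style. Here is a rough sketch of what “discovering” the data looks like (the document and tag names are invented):

   -- Untyped "grab bag" XML: the shape is whatever the sender decided today.
   DECLARE @bag xml;
   SET @bag = N'<data>
                  <foo>1</foo>
                  <bar when="2011-01-01">x</bar>
                </data>';

   -- The only way to learn what arrived is to walk the tags at run time.
   SELECT n.value('local-name(.)', 'sysname')  AS TagName,
          n.value('.', 'nvarchar(4000)')       AS TagValue
   FROM   @bag.nodes('/data/*') AS t(n);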

I worked on a system that utilized XML as the input for stored procedures. Every stored procedure had a single input parameter consisting of an XML fragment. The XML was parsed into local SQL variables inside the SP. This simplified the system because new values could simply be rolled into the XML without changing the procedure’s signature. Multiple versions could be handled simultaneously because the procedure was responsible for parsing the variant data into meaningful input parameters. Please don’t take this as a suggestion for a good architecture. There are many wicked results of this strategy.
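
To make the pattern concrete, here is a rough, hypothetical sketch of that style; the procedure, element names, and types are invented and are not the actual system I worked on:

   CREATE PROCEDURE dbo.usp_SaveOrder
       @Payload xml   -- the one and only parameter, forever
   AS
   BEGIN
       SET NOCOUNT ON;

       DECLARE @OrderId  int;
       DECLARE @Customer nvarchar(100);
       DECLARE @Total    decimal(10, 2);

       -- Shred the XML into local variables that stand in for real parameters.
       SELECT @OrderId  = @Payload.value('(/Order/OrderId)[1]',  'int'),
              @Customer = @Payload.value('(/Order/Customer)[1]', 'nvarchar(100)'),
              -- "Adding a parameter" is just agreeing on another tag name;
              -- the procedure signature never changes.
              @Total    = @Payload.value('(/Order/Total)[1]',    'decimal(10, 2)');

       -- ...the rest of the procedure uses the locals as if they were declared parameters...
       SELECT @OrderId AS OrderId, @Customer AS Customer, @Total AS Total;
   END;
   GO

   -- The caller ships whatever XML it likes; nothing in the contract enforces its shape.
   EXEC dbo.usp_SaveOrder
        @Payload = N'<Order><OrderId>42</OrderId><Customer>Contoso</Customer><Total>19.95</Total></Order>';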

I’ll pick up more on this topic tomorrow. In the meantime, do you find XML is abused in software you work with? Has it become the replacement for the accursed VB Variant data type? Get into the conversation by sending your email to btaylor@sswug.org.

Cheers,

Ben


Featured Article(s)
Database Design: From Logical to Physical
A logical data model should be used as the blueprint for designing and creating a physical database. But the physical database cannot be created properly with a simple logical to physical mapping. Many physical design decisions need to be made by the DBA before implementing physical database structures. This may necessitate deviating from the logical data model. But such deviation should occur only based on in-depth knowledge of the DBMS and the physical environment in which the database will exist.

Featured White Paper(s)
Go Beyond SCOM Monitoring with Foglight for SQL Server
(read more)

Featured Script
View input buffers for all SPIDs
This procedure, sp_AllInputBuffers, uses dynamic T-SQL to generate and execute the DBCC INPUTBUFFER statement for each server… (read more)