Before management commits us to a new schedule involving XML, we decide to do some quick performance analysis to see what kind of price we are paying for these extra steps. It may be that the prototype, while functional, doesn't represent a design that will work under load.
Let's take a look at the prototype's structure so we can get a better sense of what we are about to test. The new interface reuses the application's model code, so there's nothing new there. The changes begin to happen at the JSP level. Instead of spitting out HTML, the JSPs generate XML elements using JDOM (www.jdom.org). The XML generation looks like this (from productlist.jsp):
Element eProduct = new Element("CURRENT_PRODUCT");
Element id = new Element("ID");
id.addContent(product.getId());   // getId() and getName() assumed to mirror getDescription()
eProduct.addContent(id);
Element name = new Element("NAME");
name.addContent(product.getName());
eProduct.addContent(name);
Element description = new Element("DESCRIPTION");
description.addContent(product.getDescription());
eProduct.addContent(description);
Element ePrice = new Element("PRICE");
ePrice.addContent(price);
eProduct.addContent(ePrice);
This page looks like a good candidate for refactoring. Perhaps if we pursue this architecture, we can use some object-to-XML mapping tools. Also, this code probably should execute in a separate class so that we can test it more easily. We take down these ideas as notes for later. The XML that this page produces looks something like the following:
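Given the element names in the JSP code above, the output takes roughly this shape (the values shown are illustrative placeholders):

```xml
<CURRENT_PRODUCT>
  <ID>EST-1</ID>
  <NAME>Poodle</NAME>
  <DESCRIPTION>A friendly, curly-coated dog</DESCRIPTION>
  <PRICE>$32.50</PRICE>
</CURRENT_PRODUCT>
```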
After all, the XML has been modeled in JDOM objects; the JSP writes the XML out to a String and then passes it to Xalan's XSLT processing engine. We note this operation as a potential performance bottleneck. JDOM may already represent its objects in a form that an XSLT engine can read—reading from and writing to Strings could be unnecessary. The XSLT engine uses a style sheet specified by the JSP to transform the XML output of the page into HTML. Here is the XSL for the product element:
<table width="100%" cellpadding="0" cellspacing="0" border="0">
<xsl:value-of select="CURRENT_PRODUCT/NAME" />
<xsl:value-of select="CURRENT_PRODUCT/DESCRIPTION" />
<xsl:value-of select="CURRENT_PRODUCT/PRICE" />
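The String hand-off described above can be sketched with the standard TrAX API. This is a minimal sketch, not the prototype's actual code: the class name, the inlined stylesheet, and the sample document are our own, and the prototype would load its stylesheet from a path supplied by the JSP and get its XML String from JDOM's XMLOutputter.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class ProductTransform {

    // Stand-in for the stylesheet the JSP would name; inlined here so the
    // sketch is self-contained.
    static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "    xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "  <xsl:output method='html'/>"
      + "  <xsl:template match='/'>"
      + "    <b><xsl:value-of select='CURRENT_PRODUCT/NAME'/></b>"
      + "  </xsl:template>"
      + "</xsl:stylesheet>";

    // In the prototype, the xml String would come from something like
    // new XMLOutputter().outputString(jdomDocument) -- exactly the
    // String round trip flagged above as a potential bottleneck.
    public static String toHtml(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter html = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(html));
        return html.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(
            toHtml("<CURRENT_PRODUCT><NAME>Poodle</NAME></CURRENT_PRODUCT>"));
    }
}
```

Note that the engine parses the String back into a tree before transforming it, which is why handing Xalan a DOM or SAX source directly could avoid the serialize-and-reparse cost.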
The end result is a page that mimics the page created by the regular pet store application (we could even use HttpUnit to verify that they are structurally identical). However, with some refactoring, XML data generated by the page could be transformed in a number of different ways, or even sent to partners without transformation. If those are pressing business needs, maybe this will be a viable architecture. We have to weigh these potential advantages (and XP says we should treat potential advantages with suspicion) against the results of our JMeter testing in order to give management good feedback about the decisions they make.
Creating the Test
We decide to compare the XSL prototype's performance against that of the existing system. By taking this as our goal, we simplify our task. Real-world performance conditions can be hard to replicate, making absolute performance difficult to measure. By testing two alternatives side by side, we develop an idea of their relative worth. When deciding between the two options, that's all the data we need.
The Test Plan
We decide to subject both Web applications to the same series of tests. Several simulated users will access a series of pages over and over again. While the system is under this load, we will gather its performance metrics. We will test each application four times: with 10, 30, 100, and 500 users. We know that the AAA Pet Store gets an average of 10 to 30 concurrent users during the day, but the customer worries about the potential increase in use connected to their upcoming promotional campaign.
We model the test case after typical user behavior: We enter at the index page, go to a pet category, and then view a couple of animals within the category. This test exercises every major page in the prototype—another excellent feature. We note a couple of complicating factors: The Web server, the database, and the testing application (JMeter) are collocated. This fact eliminates one type of test noise (data transfer over an open network) but generates another: If the load becomes too high, the three applications might begin to compete with one another for scarce resources, increasing test times artificially. We decide to accept this risk, especially in light of the fact that Distributed JMeter (http://jakarta.apache.org/jmeter/usermanual/rmi.html) was developed precisely to eliminate uncertain network bottlenecks. Also, we are testing two applications side by side, and any box issues should affect both equally. Still, we minimize our risk by reducing the number of extraneous processes running on the test box.
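Runs at the four load levels are usually scripted. A rough sketch, assuming JMeter's non-GUI mode and a test plan parameterized by a `users` property; the plan name `petstore.jmx` and the property name are our inventions, not artifacts from the book:

```shell
# Dry run: print one non-GUI JMeter invocation per load level.
# Flags: -n non-GUI mode, -t test plan, -J set a JMeter property,
# -l results log file. Remove the echo to actually launch the runs.
for users in 10 30 100 500; do
  echo jmeter -n -t petstore.jmx -Jusers="$users" -l "results-$users.jtl"
done
```

Writing each run's results to its own .jtl file keeps the 10-, 30-, 100-, and 500-user data sets separate for later comparison.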