<!doctype html public "-//w3c//dtd html 4.0 transitional//en">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<link rel="stylesheet" href="" type="text/css">
<title>WTP automated testing</title>
<td ALIGN=LEFT width="80%">
<p><b><font face="Verdana" size="+3">WTP automated testing</font></b><br>
Last Updated: Nov. 29, 2004</p>
<p>This document outlines the principles of WTP automated testing. It also provides some guidelines
for creating and running test cases. If anyone has suggestions for this document,
please post a message to the wtp-dev mailing list.</p>
<table border=0 cellspacing=5 cellpadding=2 width="100%" >
<td align=LEFT valign=TOP colspan="3" bgcolor="#0080C0"> <b><font face="Arial,Helvetica" color="#FFFFFF">Testing</font></b></td>
<TD align="LEFT" valign="TOP">
<p>Unit testing has two goals:
<li>Continuous integration. Unit tests run as part of the WTP builds and give an early indication of
what is failing. Unit test failures should be fixed as soon as possible (before the next integration build).</li>
<li>API compatibility. Component teams should provide unit tests for their public APIs. This ensures developers
do not break existing clients.</li>
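<p>The API-compatibility goal can be illustrated with a small sketch. The test below is plain Java rather than a real JUnit TestCase, and the UrlUtil class is a hypothetical public API invented purely for illustration; the point is that each assertion pins down behavior an existing client may depend on.</p>

```java
// Sketch of an API-compatibility unit test. UrlUtil is a hypothetical
// published API; the checks freeze behavior that clients rely on.
public class UrlUtilCompatibilityTest {

    // Stand-in for a public API whose behavior clients depend on.
    static class UrlUtil {
        static String normalize(String url) {
            // Contract assumed here: trailing slashes are stripped.
            return url.endsWith("/") ? url.substring(0, url.length() - 1) : url;
        }
    }

    public static void main(String[] args) {
        // A change that breaks either check should fail the build.
        check(UrlUtil.normalize("http://eclipse.org/").equals("http://eclipse.org"));
        check(UrlUtil.normalize("http://eclipse.org").equals("http://eclipse.org"));
        System.out.println("API compatibility checks passed");
    }

    static void check(boolean condition) {
        if (!condition) throw new AssertionError("API behavior changed");
    }
}
```

<p>In a real WTP test plugin the same checks would live in a junit.framework.TestCase subclass so that the build can run them automatically.</p>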
<P>The goal of performance testing is to achieve equal or better performance as WTP moves
forward. As a developer, your dedication to WTP performance is strongly encouraged. To ensure that the
performance of WTP does not regress over time, developers should provide performance test cases
alongside their features. Developers are also expected to verify their
bug fixes and feature contributions against existing performance test cases. If something is not performing
well, open a bug and use "performance" as the keyword. Click <a href="">here</a> to see a list of all the
currently open performance bugs in WTP.</P>
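<p>At its core, a performance test case times an operation and fails when the time exceeds a recorded budget. A minimal sketch in plain Java (the workload and the 500 ms budget are illustrative assumptions; real WTP tests use the Eclipse performance infrastructure described below, which also stores results for comparison across builds):</p>

```java
// Minimal timing check. The workload and the 500 ms budget are
// placeholders, not actual WTP requirements.
public class ParseTimingSketch {

    // Stand-in workload for the operation under measurement.
    static long workload() {
        long sum = 0;
        for (int i = 0; i < 1000000; i++) sum += i;
        return sum;
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        workload();
        long elapsed = System.currentTimeMillis() - start;
        // A regression shows up as elapsed time exceeding the baseline.
        if (elapsed > 500) throw new AssertionError("regressed: " + elapsed + " ms");
        System.out.println("elapsed " + elapsed + " ms");
    }
}
```

<p>The Eclipse infrastructure generalizes this pattern: instead of a hard-coded budget, measured values are committed to a database and compared against a reference build.</p>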
<p>Eclipse has a <a href="">performance infrastructure</a> in place for measuring and tracking
performance. The performance processes described in this document are modeled around the same
infrastructure. To create and run tests under this infrastructure, please refer to the
<a href="">Eclipse Tests How-To document</a>.
Eclipse also provides tips and tools to help developers debug
and track down performance problems:
<li><a href="">Performance bloopers</a></li>
<li><a href="">Core tools</a></li>
<li><a href="">SWT tools</a></li>
<p>Components may have their own testing requirements. For example,
the server tools framework often calls APIs on server extensions to retrieve data about the
extension. Some of these APIs must be short-running because they are called from the UI. The server
tools framework provides abstract performance test cases that extensions should extend to verify
that code contributed by the extension does not regress performance in the base framework.
Performance requirements from component teams are listed in a document located in that
component's "development" directory in CVS. Please refer to the <a href="">WTP Development Practices</a>
document regarding any development-related issues.
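<p>The abstract-test-case pattern described above can be sketched as follows. The class and method names and the 100 ms budget are illustrative assumptions, not the actual server tools API: the framework's base class owns the timing and the pass/fail policy, while each extension supplies only the call under test.</p>

```java
// Sketch of the abstract-test-case pattern. AbstractExtensionTest,
// getRuntimeLabel and the 100 ms budget are hypothetical names/values.
public class ExtensionTimingSketch {

    // Base class supplied by the framework: it owns the measurement
    // and the pass/fail policy.
    static abstract class AbstractExtensionTest {
        abstract String getRuntimeLabel();   // the extension's API call

        void testShortRunning() {
            long start = System.currentTimeMillis();
            String label = getRuntimeLabel();
            long elapsed = System.currentTimeMillis() - start;
            if (label == null || elapsed > 100)
                throw new AssertionError("UI-path call too slow: " + elapsed + " ms");
        }
    }

    // An extension's concrete subclass exercises its own implementation.
    static class MyServerExtensionTest extends AbstractExtensionTest {
        String getRuntimeLabel() { return "My Server Runtime"; }
    }

    public static void main(String[] args) {
        new MyServerExtensionTest().testShortRunning();
        System.out.println("extension call stayed within budget");
    }
}
```

<p>This template-method design is what lets the framework enforce a uniform performance bar across all extensions without knowing their internals.</p>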
<table border=0 cellspacing=5 cellpadding=2 width="100%" >
<td align=LEFT valign=TOP colspan="3" bgcolor="#0080C0"> <b><font face="Arial,Helvetica" color="#FFFFFF"><a name="policy"></a>Creating junit test cases</font></b></td>
<td align="LEFT" valign="TOP">
<p>Here's a checklist for integrating test plugins into the WTP build:
<li>Commit the plugin into CVS, using the component's "performance-tests" folder. For example,
the org.eclipse.wst.wsdl.tests.performance plugin should be placed into that folder.</li>
<li>Add the plugin to the component's tests map file. This map file can be found inside
the /home/webtools/org.eclipse.wtp.releng/maps directory.</li>
<li>Add the plugin to the feature.xml file. This file can be found inside the
/home/webtools/&lt;sub-project&gt;/assembly/features/&lt;performance feature&gt; directory.</li>
<li>Update test.xml inside /home/webtools/org.eclipse.wtp.releng/testScripts to include
the new performance plugins.</li>
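<p>For the feature.xml step, the change amounts to listing the new plugin in the performance feature. A minimal sketch (the feature id and version shown here are placeholders; the actual values come from the sub-project's performance feature):</p>

```xml
<feature id="org.eclipse.sample.performance.tests.feature" version="1.0.0">
   <!-- ...existing test plugins... -->
   <plugin
         id="org.eclipse.wst.wsdl.tests.performance"
         version="0.0.0"/>
</feature>
```

<p>The version "0.0.0" is the common Eclipse convention that lets the build substitute the plugin's real version at assembly time.</p>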
The <a href="">Eclipse Tests How-To document</a>
has a thorough explanation of how to create and run performance test cases using the Eclipse
performance infrastructure.
<table border=0 cellspacing=5 cellpadding=2 width="100%" >
<td align=LEFT valign=TOP colspan="3" bgcolor="#0080C0"> <b><font face="Arial,Helvetica" color="#FFFFFF">WTP performance process</font></b></td>
<td align="LEFT" valign="TOP">
<p>This section describes the process for tracking performance in WTP. It is based on the
process used by the Eclipse Platform Project. All performance tests must be automated.
Performance tests are run every week using Thursday's
integration builds. Performance tests should:
<li>never have compile errors</li>
<li>always run to completion</li>
<p>If either condition is violated, the failure should be handled immediately according to the
<a href="">WTP Development Practices document</a>.
Performance results are stored in a Cloudscape database and are compared against results from
the previous release. In case of a regression, a note will be posted to the mailing list
indicating the problem. In that case:
<li>the developer who introduced the regression should fix the performance problem</li>
<li>if the regression can be justified by a new feature, the PMC must get involved
and decide how important that feature is (e.g., competitive considerations). Solutions may
include, but are not limited to, making the feature optional (e.g., creating a preference that is turned
off by default), so that only users who wish to use the feature pay for it.</li>
<p>Performance results from the weekly integration build are rendered into a graph, which is
linked from the build page. This graph provides a simple comparison between the integration
build and the reference build.
<p>To run performance tests for a build that's available from the Eclipse download Web site:
<li>Check out /home/webtools/org.eclipse.wtp.releng</li>
<li>Change the properties files to fit your system (build.cfg, among others)</li>
<li>Open a command prompt and navigate to the org.eclipse.wtp.releng directory</li>
<li>Run the following command:
<p>ant -f cruise.xml -DbuildType=&lt;buildType&gt; -DbuildId=&lt;buildId&gt; -Dtimestamp=&lt;timestamp&gt; performance</p>
<p>For example:</p>
<p>ant -f cruise.xml -DbuildType=N -DbuildId=N20041127 -Dtimestamp=200411271458 performance</p>
Running performance tests for a local build is similar. Go to your ${buildDirectory} directory and check the
buildType, buildId and timestamp for your local build, then go through the same steps as above. If you have a Cloudscape
database set up (refer to the <a href="">Eclipse Tests How-To document</a>),
the performance results will be written to the ${testDir}/results directory; otherwise they
will be displayed in the console.