| <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"> |
| <html> |
| <head> |
| <meta http-equiv=Content-Type content="text/html; charset=ISO-8859-1"> |
| <title>Performance Tests HowTo</title> |
| <link rel="stylesheet" href="http://dev.eclipse.org/default_style.css" type="text/css"> |
| </head> |
| <body> |
| |
| <h1>Performance Tests HowTo</h1> |
| <p> |
| By Sonia Dimitrov, Christof Marti, Andre Weinand<br> |
| 2005/02/24<br> |
| |
| <p> |
| The Eclipse performance test plugin (org.eclipse.test.performance) |
| provides infrastructure for instrumenting programs |
| to collect performance data and to assert that performance doesn't drop |
| below a baseline. The infrastructure is supported on Windows, Linux, |
| and MacOS X. |
| |
| <p> |
The first part of this document describes how performance tests are
written and executed; the <a href="#cloudscape">second part</a> explains how performance data is
collected in a database and how this database is installed and
configured.
| |
| <h2>Writing Performance Tests</h2> |
| |
| <h3>Setting up the environment</h3> |
| |
| <ul> |
<li>check out the following plug-ins from dev.eclipse.org:</li>
| <ul> |
| <li>org.eclipse.test.performance</li> |
| <li>org.eclipse.test.performance.win32 (for Windows only)</li> |
| </ul> |
| <li>you need org.junit</li> |
<li>add org.eclipse.test.performance to your test plug-in's dependencies (see the example below)</li>
| </ul> |
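<p>
For example, in a 3.0-style plugin.xml the dependency entries would look like the following
fragment (a minimal sketch; your plug-in will list additional prerequisites). If your test
plug-in uses a bundle manifest instead, add the plug-ins to the Require-Bundle header of its
MANIFEST.MF.
<pre>
<requires>
   <import plugin="org.junit"/>
   <import plugin="org.eclipse.test.performance"/>
</requires>
</pre>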
| |
| <h3>Writing a performance test case</h3> |
| |
A performance test case is an ordinary JUnit <code>TestCase</code>.
| <ul> |
| <li>create a test case with test methods along the lines of: |
| <pre> |
| public void testMyOperation() { |
| Performance perf= Performance.getDefault(); |
| PerformanceMeter performanceMeter= perf.<b>createPerformanceMeter</b>(perf.getDefaultScenarioId(this)); |
| try { |
| for (int i= 0; i < 10; i++) { |
| performanceMeter.<b>start</b>(); |
| toMeasure(); |
| performanceMeter.<b>stop</b>(); |
| } |
| performanceMeter.<b>commit</b>(); |
| perf.<b>assertPerformance</b>(performanceMeter); |
| } finally { |
| performanceMeter.<b>dispose</b>(); |
| } |
| } |
| </pre> |
| </li> |
| |
| <li>or create a test case extending <code>PerformanceTestCase</code> |
| which is a convenience class that makes the |
| use of <code>PerformanceMeter</code> transparent: |
| <pre> |
public class MyPerformanceTestCase extends PerformanceTestCase {
| |
| public void testMyOperation() { |
| for (int i= 0; i < 10; i++) { |
| <b>startMeasuring</b>(); |
| toMeasure(); |
| <b>stopMeasuring</b>(); |
| } |
| <b>commitMeasurements</b>(); |
| <b>assertPerformance</b>(); |
| } |
| } |
| </pre> |
| </li> |
| </ul> |
| |
| Notes: |
| <ul> |
| <li> |
| The scenario id passed to <code>createPerformanceMeter(...)</code> must be unique in a single test run |
| and must be the same for each build. This enables comparisons between builds. |
The <code>Performance#getDefaultScenarioId(...)</code> methods are provided for convenience.
| </li> |
| |
| <li> |
| <code>PerformanceMeter</code> supports repeated measurements by multiple invocations |
| of the <code>start()</code>, <code>stop()</code> sequence. The call |
| to <code>commit()</code> is required before evaluation with <code>assertPerformance()</code> |
| and <code>dispose()</code> is required before releasing the meter. |
| </li> |
| |
<li>The first iterations of the above <code>for</code>-loop will generally take more time because the
code has not been optimized by the JIT compiler yet. This can introduce variance into the
measurements, especially if other tests run before and change the JIT's optimization of the
measured code in some way. A simple way to stabilize the measurements
is to run the code a few times before the measurements start (see the sketch after these notes).
Caches also need special caution as they can affect the measurements.
</li>
| |
<li>
As a rule of thumb, the measured code should take at least 100ms on the target machine in
order for the measurements to be relevant. For example, Windows' and Linux 2.4's system time increases
in 10ms steps. In some cases the measured code can be invoked repeatedly to accumulate the
elapsed time; keep in mind, however, that the JIT could optimize such a loop more aggressively
than in real-world scenarios.
</li>
| </ul> |
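<p>
A minimal sketch of such a stabilized test, combining a few unmeasured warm-up runs with
repeated invocations inside each start/stop interval (the method <code>toMeasure()</code> and
the iteration counts are placeholders; tune them for your scenario):
<pre>
public void testMyOperation() {
	Performance perf= Performance.getDefault();
	PerformanceMeter performanceMeter= perf.createPerformanceMeter(perf.getDefaultScenarioId(this));
	try {
		// warm-up runs: give the JIT a chance to optimize the measured
		// code and fill caches before any data is collected
		for (int i= 0; i < 3; i++)
			toMeasure();
		for (int i= 0; i < 10; i++) {
			performanceMeter.start();
			// invoke the measured code repeatedly so that the accumulated
			// time stays well above the system timer's 10ms resolution
			for (int j= 0; j < 5; j++)
				toMeasure();
			performanceMeter.stop();
		}
		performanceMeter.commit();
		perf.assertPerformance(performanceMeter);
	} finally {
		performanceMeter.dispose();
	}
}
</pre>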
| |
| <h3>Participating in the performance summary (aka "Performance Fingerprint")</h3> |
| |
| If the number of performance tests grows large, it becomes harder to |
| get a good overview of the performance characteristics of a build. A |
| solution for this problem is a performance summary chart that |
| tries to condense a small subset of key performance tests into a chart |
| that fits onto a single page. Currently the performance infrastructure |
| supports two levels of summaries, one global and any number of "local" summaries. |
| A local summary is typically associated with a component. |
| <p> |
A summary bar chart shows the performance development of about 20 tests relative to a
reference build in an easy-to-grasp red/green presentation.
| <p> |
| <img alt="summary graph" src="FP_I200409240800_I200410050800.jpeg"> |
| <p> |
Depending on the total number of components, every Eclipse component can tag one or
two tests for inclusion in the global summary and up to 20 for a local performance summary.
Tests marked for the global summary are automatically included in the local summary.
| <p> |
Marking a test for inclusion is done by passing a performance meter
into the method Performance.tagAsGlobalSummary(...) or Performance.tagAsSummary(...).
Both methods must be called outside of the start/stop calls and before the
call to commit().
| <pre> |
| // .... |
| Performance perf= Performance.getDefault(); |
| PerformanceMeter pm= perf.createPerformanceMeter(perf.getDefaultScenarioId(this)); |
| perf.<b>tagAsGlobalSummary</b>(pm, "A Short Name", Dimension.CPU_TIME); |
| try { |
| // ... |
| </pre> |
| <p> |
In order to keep the overview graph small, only a single dimension (CPU_TIME, USED_JAVA_HEAP, etc.) of
the test's data is shown and a short name is used to label the
data (instead of the rather long scenario ID). Both the short label and
the dimension must be supplied in the calls to <code>tagAsGlobalSummary</code> and <code>tagAsSummary</code>.
The available dimensions can be found in <code>org.eclipse.test.performance.Dimension</code>.
| |
| <p> |
<code>PerformanceTestCase</code> provides similar methods, which must be called
before <code>startMeasuring()</code>:
| <pre> |
| public class MyPerformanceTestCase extends PerformanceTestCase { |
| |
| public void testMyOperation() { |
| <b>tagAsSummary</b>("A Short Name", Dimension.CPU_TIME); |
| for (int i= 0; i < 10; i++) { |
| startMeasuring(); |
| toMeasure(); |
| stopMeasuring(); |
| } |
| commitMeasurements(); |
| assertPerformance(); |
| } |
| }</pre> |
| |
| <h3>Running a performance test case (from a launch configuration)</h3> |
| |
| <ul> |
| <li>create a new JUnit Plug-in Test launch configuration for the test case</li> |
| <li>add "-Xms256M -Xmx256M" or similar to the VM arguments to avoid memory pressure during the measurements</li> |
| <li>run the launch configuration</li> |
<li>by default the measured averages of the performance meter
are written to the console on <code>commit()</code>. This output is
suppressed if the performance tests are configured to store their data in the
database (see below).
</li>
| </ul> |
| |
| <h3>Running a performance test case (from the command-line)</h3> |
| |
| <ul> |
<li>this method is of particular interest if you want to do precise measurements of a specific build</li>
| <li>export the following plug-ins as 'Deployable plug-ins' to the eclipse installation directory (not the 'plugins' directory; choose to deploy as |
| 'a directory structure')</li> |
| <ul> |
| <li>org.eclipse.test</li> |
| <li>org.eclipse.test.performance</li> |
| <li>org.eclipse.test.performance.win32 (for Windows only)</li> |
| <li>your test plug-in(s)</li> |
| </ul> |
<li>copy the following .bat file and customize it to your needs (Windows; a Linux/MacOS X translation follows this list):</li>
| <pre> |
| @echo off |
| |
| REM Installation paths |
| SET ECLIPSEPATH=c:\eclipse |
| SET JVMPATH=c:\jdk\jdk1.4.2_05 |
| |
| REM Paths, relative to ECLIPSEPATH |
| SET BUILD=I200411050810 |
| SET WORKSPACE=workspace\performance |
| |
| REM Test |
| SET TESTPLUGIN=org.eclipse.jdt.text.tests |
| SET TESTCLASS=org.eclipse.jdt.text.tests.performance.OpenQuickOutlineTest |
| |
| REM For headless tests use: org.eclipse.test.coretestapplication |
| SET APPLICATION=org.eclipse.test.uitestapplication |
| |
| REM Add -clean when the installation changes |
| SET OPTIONS=-console -consolelog -showlocation |
| SET JVMOPTIONS=-Xms256M -Xmx256M |
| |
| ECHO Build: %ECLIPSEPATH%\%BUILD% |
| ECHO Workspace: %ECLIPSEPATH%\%WORKSPACE% |
| ECHO Test: %TESTPLUGIN%\%TESTCLASS% |
| |
| %JVMPATH%\bin\java %JVMOPTIONS% -cp %ECLIPSEPATH%\%BUILD%\startup.jar org.eclipse.core.launcher.Main %OPTIONS% -application %APPLICATION% -data %ECLIPSEPATH%\%WORKSPACE% -testPluginName %TESTPLUGIN% -className %TESTCLASS% |
| |
| pause |
| </pre> |
| <li>after testing the setup you may want to close other applications to avoid distortion of the measurements</li> |
| </ul> |
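<p>
On Linux or MacOS X a direct translation of the above .bat file into a Bourne shell script
might look like this (the installation paths are assumptions; adapt them as you would in the
.bat file):
<pre>
#!/bin/sh

# Installation paths
ECLIPSEPATH=$HOME/eclipse
JVMPATH=/usr/java/jdk1.4.2_05

# Paths, relative to ECLIPSEPATH
BUILD=I200411050810
WORKSPACE=workspace/performance

# Test
TESTPLUGIN=org.eclipse.jdt.text.tests
TESTCLASS=org.eclipse.jdt.text.tests.performance.OpenQuickOutlineTest

# For headless tests use: org.eclipse.test.coretestapplication
APPLICATION=org.eclipse.test.uitestapplication

# Add -clean when the installation changes
OPTIONS="-console -consolelog -showlocation"
JVMOPTIONS="-Xms256M -Xmx256M"

echo "Build:     $ECLIPSEPATH/$BUILD"
echo "Workspace: $ECLIPSEPATH/$WORKSPACE"
echo "Test:      $TESTPLUGIN/$TESTCLASS"

$JVMPATH/bin/java $JVMOPTIONS -cp $ECLIPSEPATH/$BUILD/startup.jar org.eclipse.core.launcher.Main $OPTIONS -application $APPLICATION -data $ECLIPSEPATH/$WORKSPACE -testPluginName $TESTPLUGIN -className $TESTCLASS
</pre>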
| |
| <h3>Running a performance test case (within the Automated Testing Framework on each build)</h3> |
| |
If the <code>test.xml</code> of your test plug-in already exists
and looks similar to the <code>jdt.text.tests</code>' one, add targets
similar to those shown below. The <code>performance</code> target is
the entry point for performance testing, just as the <code>run</code> target
is for correctness testing.
| <pre> |
| <!-- This target defines the performance tests that need to be run. --> |
| <target name="performance-suite"> |
| <property name="your-performance-folder" value="${eclipse-home}/your_performance_folder"/> |
| <delete dir="${your-performance-folder}" quiet="true"/> |
| <ant target="ui-test" antfile="${library-file}" dir="${eclipse-home}"> |
| <property name="data-dir" value="${your-performance-folder}"/> |
| <property name="plugin-name" value="${plugin-name}"/> |
| <property name="classname" value="<em><your fully qualified test case class name></em>"/> |
| </ant> |
| </target> |
| |
| <!-- This target runs the performance test suite. Any actions that need to happen --> |
| <!-- after all the tests have been run should go here. --> |
| <target name="performance" depends="init,performance-suite,cleanup"> |
| <ant target="collect" antfile="${library-file}" dir="${eclipse-home}"> |
| <property name="includes" value="org*.xml"/> |
| <property name="output-file" value="${plugin-name}.xml"/> |
| </ant> |
| </target> |
| </pre> |
| |
| Notes: |
| <ul> |
| <li> |
| Performance measurements are run on a dedicated performance measurement machine of the Releng team. |
| </li> |
| |
| <li> |
| If you have created a new source folder, do not forget to include |
| it in the build (add it to the "build.properties" file). |
| </li> |
| </ul> |
| |
| <h3>Running a performance test case (within the Automated Testing Framework, locally)</h3> |
| |
| <ul> |
| <li> |
| modify <code>test.xml</code> as described above |
| </li> |
| |
| <li> |
| download a build and its accompanying test framework plug-ins contained |
| in eclipse-test-framework-*.zip |
| </li> |
| |
| <li> |
| unzip the Eclipse SDK and the eclipse-test-framework zip to install your target Eclipse<br> |
| (you need <a href="http://www.info-zip.org/pub/infozip/UnZip.html">Info-ZIP |
| UnZip</a> version 5.41 or later (<a href="ftp://sunsite.cnlab-switch.ch/mirror/infozip/WIN32/unz551xN.exe">Windows</a>) |
| installed and added to the path). |
| </li> |
| |
| <li> |
| export your test plug-in as a 'Deployable plug-in' to the target Eclipse installed |
| above; choose to export as directory structure. |
| </li> |
| |
| <li> |
open a terminal or command window and execute the following (on a single line; a concrete example follows this list):<br>
| <pre> |
java -jar <i><b><an eclipse install></b></i>/startup.jar
| -application org.eclipse.ant.core.antRunner |
| -file <i><b><target eclipse install/plugins/your test plug-in id_version></b></i>/test.xml |
| performance |
| -Dos=<i><b><os></b></i> -Dws=<i><b><ws></b></i> -Darch=<i><b><arch></b></i> -Declipse_home=<i><b><target eclipse install></b></i> |
| "-Dvmargs=-Xms256M -Xmx256M" |
| -logger org.apache.tools.ant.DefaultLogger |
| </pre> |
| </li> |
| |
| <li> |
| The JUnit results are written to an xml file in the root of the target Eclipse |
| and the performance measurements are written to the console. |
| </li> |
| |
| </ul> |
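<p>
As a concrete example, on Windows with the target Eclipse installed in c:\eclipse and the
jdt text tests deployed, the command might look as follows (the paths and the plug-in
version are assumptions; substitute your own):
<pre>
java -jar c:\eclipse\startup.jar
  -application org.eclipse.ant.core.antRunner
  -file c:\eclipse\plugins\org.eclipse.jdt.text.tests_3.1.0\test.xml
  performance
  -Dos=win32 -Dws=win32 -Darch=x86 -Declipse_home=c:\eclipse
  "-Dvmargs=-Xms256M -Xmx256M"
  -logger org.apache.tools.ant.DefaultLogger
</pre>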
| |
| <h2><a name="cloudscape"></a>Setting up the Derby database</h2> |
| |
| Performance tests are only valuable if measured data can be |
| monitored over time and compared against reference data. |
| For this functionality the Eclipse performance plugin makes use of |
| the Apache project's <a href="http://incubator.apache.org/derby/">Derby</a> |
| database (formerly called |
| <a href="http://www-306.ibm.com/software/data/cloudscape/">Cloudscape</a>). |
| </p> |
| <p>Derby is a database engine written in Java that can be accessed |
| via JDBC. Derby is easily embeddable in Java programs or can run |
| as a network server. |
| </p> |
| <p>This section describes how to install Derby and how to |
| configure the performance test plugin to use Derby. |
| |
| <h3>Getting and installing Derby</h3> |
| |
The performance infrastructure does not include Derby.
If you want to leverage Derby,
you need to download and install it yourself.
| |
| <p> |
The performance plugin has an optional prerequisite on an "org.apache.derby"
library project. Since it is optional, you won't see any compile-time
errors when you load the performance plugin from the Eclipse repository
without having the Derby project in your workspace. However,
you'll see runtime errors when the tests try to access
the database.<br>
| |
| <p> |
| If you have access to the following repository you can get the |
| org.apache.derby library project from there: |
| <pre> |
| :pserver:anonymous@ottcvs1.ott.oti.com:/home/cvs/zrheclipse |
| </pre> |
| Otherwise get Derby from |
| <a href="http://www-106.ibm.com/developerworks/db2/library/techarticle/dm-0408cline/index.html">here</a>. |
| Unpack the archive to any directory. |
| <br> |
To create a library project for Derby, open the Java project
wizard and enter "org.apache.derby" as the project's name. Go to the next
page and select the "Libraries" tab. Remove the JRE and add the five
jar files (db2jcc.jar, db2jcc_license_c.jar, derby.jar, derbynet.jar, and
derbytools.jar) from Derby's lib directory via the "Add External JARs"
button. Switch to the "Order and Export" tab and check all five
libraries. Press "Finish". Create a new file "plugin.xml" inside the
Derby project and paste the following contents into it:<br>
| <pre> |
| <?xml version="1.0" encoding="UTF-8"?> |
| <?eclipse version="3.0"?> |
| <plugin |
| id="org.apache.derby" |
| name="Derby" |
| version="1.0.0"> |
| <runtime> |
| <library name="db2jcc.jar"> |
| <export name="*"/> |
| </library> |
| <library name="db2jcc_license_c.jar"> |
| <export name="*"/> |
| </library> |
| <library name="derby.jar"> |
| <export name="*"/> |
| </library> |
| <library name="derbynet.jar"> |
| <export name="*"/> |
| </library> |
| <library name="derbytools.jar"> |
| <export name="*"/> |
| </library> |
| </runtime> |
| </plugin> |
| </pre> |
| |
In addition you'll need to load the performance plugin
(org.eclipse.test.performance) and, if you are running
on Windows, the associated fragment (org.eclipse.test.performance.win32).
| |
| |
| <h3>Configuring the performance plugin for using Derby</h3> |
| |
The performance test plugin is configured via the three Java properties
<code>eclipse.perf.dbloc</code>, <code>eclipse.perf.config</code>, and <code>eclipse.perf.assertAgainst</code>.
<p>
The <code>eclipse.perf.dbloc</code> property specifies where the Derby DB is located.
If no value is given, as in
| <pre> |
| -Declipse.perf.dbloc= |
| </pre> |
| Derby runs in embedded mode (not as a separate server) |
| and the DB will live in your home directory. |
| <p> |
If an absolute or relative path is given, Derby uses or creates
the DB in that location. For example, on Linux and MacOS X, with
| <pre> |
| -Declipse.perf.dbloc=/tmp/derby |
| </pre> |
| Derby runs in embedded mode and creates the database under |
| /tmp/derby. |
| <p> |
| To connect to a Derby server running locally (or remotely) use the |
| following: |
| <pre> |
| -Declipse.perf.dbloc=net://<i>tcp-ip address</i> |
| </pre> |
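For example, to connect to a server started on the same machine (as described in the last
section of this document), something like the following should work:
<pre>
-Declipse.perf.dbloc=net://localhost
</pre>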
With the properties <code>eclipse.perf.config</code> and <code>eclipse.perf.assertAgainst</code>
you specify the name under which performance data is stored in the
database and the name of the reference data to compare against. This
"name" is not a single string but a set of key/value pairs separated by
semicolons:
| <pre> |
| -Declipse.perf.config=<i>key1</i>=<i>value1</i>;<i>key2</i>=<i>value2</i>;...;<i>keyn</i>=<i>valuen</i> |
| -Declipse.perf.assertAgainst=<i>key1</i>=<i>value1</i>;<i>key2</i>=<i>value2</i>;...;<i>keyn</i>=<i>valuen</i> |
| </pre> |
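(Note that on Unix shells the semicolon is a command separator, so the whole argument should
be quoted on the command line, e.g. <code>"-Declipse.perf.config=build=N20040914;host=relengwin"</code>.)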
The key/value pairs can be used to associate the collected
performance data with information about the configuration that was used
to generate the data. Typically this includes the name of the build,
the system on which the tests were run, or the Java VM used. So in this
example:
| <pre> |
| -Declipse.perf.config=build=N20040914;host=relengwin;jvm=j9 |
| </pre> |
performance data for the nightly build N20040914 is stored in the
database under a "name" that consists of three key/value pairs.<br>
If the tests are run multiple times with the same arguments, the new
data does not replace old data but is added under the same name.
Programs that visualize the data are expected to aggregate it,
for example by calculating the average over all runs.
| |
| <p> |
| To assert that performance data collected for another build does not |
| degrade with respect to some reference data the assertAgainst property |
| is used similarly: |
| <pre> |
| -Declipse.perf.assertAgainst=build=R3.0;host=relengwin;jvm=j9 |
| </pre> |
| This property enables any "assertPerformance" calls in your performance |
| tests and compares the newly measured data against the data specified |
| by the three key/value pairs. Please note that the order of the pairs |
| does not matter when looking up the data in the database. However, the |
| number of key/value pairs must be identical. |
| <p> |
Because in most cases you want to store newly collected data and at the
same time assert against other reference data, you'll need to
specify both properties. In this case only those key/value pairs that
differ from the config property must be listed in the assertAgainst
property:
| <pre> |
| -Declipse.perf.config=build=N20040914;host=relengwin |
| -Declipse.perf.assertAgainst=build=R3.0 |
| </pre> |
So in the example above, the new performance data is stored in the
database under the build name "N20040914" and the host "relengwin",
and "assertPerformance" compares this data against data tagged with the
build name "R3.0" and the implicitly specified host "relengwin".
| <p> |
| If you want to assert the average of multiple runs (instead of the data |
| of a single run) |
| against the reference data, do the following: |
| <pre> |
| // Run program 4 times to collect data under build name "I20040914" |
| ... -Declipse.perf.config=build=I20040914 |
| ... -Declipse.perf.config=build=I20040914 |
| ... -Declipse.perf.config=build=I20040914 |
| ... -Declipse.perf.config=build=I20040914 |
| |
| // Run program a 5th time and collect more data under I20040914 |
| // and assert the average of 5 runs of I20040914 against some baseline data |
| ... -Declipse.perf.config=build=I20040914 -Declipse.perf.assertAgainst=build=R3.0 |
| </pre> |
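<p>
Putting the pieces together, a run that stores its data in an embedded database under
/tmp/derby and asserts against the R3.0 baseline would combine all three properties,
for example (the remainder of the launch is as in the command-line section above):
<pre>
java -Declipse.perf.dbloc=/tmp/derby
     "-Declipse.perf.config=build=N20040914;host=relengwin"
     -Declipse.perf.assertAgainst=build=R3.0
     ...
</pre>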
| |
| <h3>Viewing the data</h3> |
| |
| Since we do not (yet) have fancy visualization tools, the performance |
| test plugin provides a class <code>org.eclipse.test.internal.performance.db.View</code> |
| that can be run as a standalone program for viewing the data contained in the database in a tabular format. |
| <p> |
| You need to specify the database location via the <code>eclipse.perf.dbloc</code> |
| property (most easily done via a launch configuration). |
| Select the data to view by either specifying a variation via the <code>eclipse.perf.config</code> property or |
| by directly setting the key/value pairs of the variation at the beginning of the program's <code>main</code> method. |
| If you only want to view specific scenarios, use an appropriate pattern for the local variable <code>scenarioPattern</code>. |
| The local variable <code>seriesKey</code> specifies what variation is shown on the x-axis of the table. |
| <p> |
| So the following setup: |
| <pre> |
| public class View { |
| |
| public static void main(String[] args) { |
| |
| Variations <b>variations</b>= PerformanceTestPlugin.getVariations(); |
| variations.put("host", "relengwin"); |
| variations.put("build", "I%"); |
| |
| String <b>scenarioPattern</b>= "%RevertJavaEditorTest%"; |
| |
| String <b>seriesKey</b>= "build"; |
| |
| // ... |
| </pre> |
creates a table showing all dimensions of the (single) scenario selected by the pattern
"<code>%RevertJavaEditorTest%</code>" for all integration builds (that is, builds whose names start with a capital 'I').
| <pre> |
| Scenario: org.eclipse.jdt.text.tests.performance.RevertJavaEditorTest#testRevertJavaEditor() |
| Builds: I200409240800 I200409281200 I200410050800 I200410190941 I200410260800 |
| CPU Time: 1.02 s [284 ms] 1.05 s [327 ms] 971 ms 1 s 481 ms |
| Committed: 69K [246K] 119K [389K] 103K 111K -97484 |
| Elapsed Process: 1.02 s [286 ms] 1.07 s [345 ms] 981 ms 1.01 s 481 ms |
| Kernel time: 41 ms [27 ms] 48 ms [40 ms] 46 ms 28 ms 22 ms |
| Page Faults: 145 [125] 148 [125] 176 191 143 |
| System Time: 1.02 s [285 ms] 1.06 s [345 ms] 981 ms 1.01 s 477 ms |
| </pre> |
| |
If you are interested in creating performance charts and tables similar to those available on the Eclipse platform
download pages, you can try the stand-alone Java program org.eclipse.test.performance.ui.Main stored in the
org.eclipse.releng.basebuilder project. Refer to the <a href="http://dev.eclipse.org/viewcvs/index.cgi/%7Echeckout%7E/org.eclipse.releng.basebuilder/plugins/org.eclipse.test.performance.ui/readme.html?rev=HEAD&content-type=text/html">readme.html</a> in
org.eclipse.releng.basebuilder/plugins/org.eclipse.test.performance.ui for more details.
| |
| <h3>How to setup a Derby server (on Linux and MacOS X)</h3> |
| |
| <ul> |
| |
| <li>Either get Derby from the repository or from <a |
| href="http://www-106.ibm.com/developerworks/db2/library/techarticle/dm-0408cline/index.html">here</a>. |
| </li> |
| |
<li>Get the (Bourne) shell script "derby.sh" from the scripts folder of <code>org.eclipse.test.performance</code>
and install it on the server
(rename it to "derby" and make it executable; if you've checked out the
file on Windows and copied it to Linux,
it might be necessary to convert the line delimiters with the dos2unix
tool).
The script simplifies the usage of the Derby tools, especially
starting and stopping the server, because
it sets the correct classpath and some important properties.
</li>
| |
| <li>Edit the script and adapt the variables <code>CSLIB</code>, <code>DBROOT</code>, |
| and <code>JAVA</code> |
| to your installation. <code>CSLIB</code> should point to the directory |
| containing the Derby jars. |
| If you've used the Derby installer, then this is the lib directory |
| inside the Cloudscape 10.0 directory. |
| If you are using the org.apache.derby project from the repository, then this |
| is just the project folder. |
| </li> |
| |
<li>in a shell execute
<pre>
derby start &
</pre>
to launch the server in the background. The server prints the following
to stdout:
<pre>
Server is ready to accept connections on port 1527.
</pre>
</li>
| |
| <li>stop the server with |
| <pre> |
| derby stop |
| </pre> |
| </li> |
| </ul> |
| |
| </body> |
| </html> |