Index Changes

Difference between version and version     

Back to IEPSE Driver Tests, or IEPSE Driver Tests Info


At line 3 changed 4 lines.
To test an individual project
#Open the comp app project (e.g. open-jbi-components\driver-tests\iepse\IEPDeleteStream\DeleteStreamApp)
#Right click on the comp app project node and select "Clean and Build". The build should show a message that it was successful.
#Right click on the comp app project node and select "Test". The Netbeans "JUnits Test Results" window will show the results. The right side will have the name of the test that failed and right side will have the logs from the Netbeans Composite App Test Manager (the module that executes the tests).
!!IEP Regression Testing
These tests are run continuously against the IEP codebase (usually trunk) to check for regressions. The following combinations of OS and database are currently being tested: NT/Derby, Solaris/Oracle, and NT/DB2. We are working towards making the test results available on java.net.
At line 8 changed 3 lines.
Some notes on the tests
*The tests have been written using N2M test framework. Please see [for details|http://wiki.open-esb.java.net/Wiki.jsp?page=CreatingAnN2MTest]
*
!!To test an individual project
#Open the Composite Application (CompApp) project (e.g. open-jbi-components\driver-tests\iepse\IEPDeleteStream\DeleteStreamApp)
#Right click on the CompApp project node and select "Clean and Build". The build should show a message that it was successful.
#Right click on the CompApp project node and select "Test". The Netbeans "JUnit Test Results" window will show the results. The left side will show the names of any tests that failed and the right side will show the logs from the Netbeans JBI CompApp Test Manager (the module that executes the tests).
!!Notes on the tests
*The tests have been written using the JBI Test Manager's N2M test framework. Please see [Creating an N2M Test|http://wiki.open-esb.java.net/Wiki.jsp?page=CreatingAnN2MTest] for details.
*Many tests have wait statements in the test script file. These waits introduce time lags between sending events, which makes the test outputs predictable in some cases. In the case of TupleBasedWindow, if events are sent with no lag, the operator may treat them differently than events sent with a time lag. For example, if the size of the window is 1 and five events are sent simultaneously, only one event can occupy the window, and the event that gets retained could be any one of the five. So, to make the test more predictable, it is better to send the five events with a lag of 1 second between each. Other examples are operators with a time-related property, such as TimeBasedWindow, TimeBasedAggregator, NotificationWindow and BatchedOutputStream. There could be other reasons why the time lags help; we found that a large number of test cases could be made predictable by adding appropriate lags.
*The tests can be run using any database that IEP supports. In a handful of cases, the output for a test can differ based on the database being used. The N2M framework's "Context based configuration" was used to support such test cases. For example, see the test case open-jbi-components\driver-tests\iepse\IEPRelationAggregator\RelationAggregatorBApp\test\TupleBasedWindowInput. The test has a TupleBasedAggregator that uses the AVG function. The AVG is applied to integers which are part of the incoming events. The result of the operation is different for Derby and Oracle because each uses a different rounding scheme, which results in different output files for Derby and Oracle. Since both are valid outputs, "Context based configuration" was used to specify the different output files based on the context (in this case the database being used). Another case where the outputs can differ is when the RelationOutput is used as the output operator. There could be other cases, which need to be evaluated case by case.
*Some tests pass with one database and fail with others. At this point more tests pass on Derby than on Oracle. In such cases, the test has been marked as in progress for the platform on which it fails (featurestatus=progress) and a bug has been filed. For example, see open-jbi-components\driver-tests\iepse\IEPRelationAggregator\RelationAggregatorApp\test\GapWindowInput. The corresponding bug is https://open-esb.dev.java.net/issues/show_bug.cgi?id=1311
*The N2M framework supports a special wait command for testing TimeBasedAggregators. Please see the section "Special wait command for IEP" in [Creating an N2M Test|http://wiki.open-esb.java.net/Wiki.jsp?page=CreatingAnN2MTest] for details. For an example, please see the test case open-jbi-components\driver-tests\iepse\IEPAttributeBasedWindow\AttributeBasedWindowApp\test\TimeBasedAggregatorInput
*During test development, we found that the test definition sometimes needed to be changed to make the test workable and predictable for a particular database. In such cases, it is important to ensure that the changed test still works on all platforms.
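The size-1 TupleBasedWindow scenario described above can be sketched in isolation. This is an illustrative model, not IEP's implementation: the `tuple_based_window` helper is hypothetical, and a random shuffle stands in for the unpredictable processing order of simultaneously sent events.

```python
import random
from collections import deque

def tuple_based_window(events, size):
    """Model of a tuple-based window: retain only the `size` most recent events."""
    window = deque(maxlen=size)
    for event in events:
        window.append(event)
    return list(window)

events = ["e1", "e2", "e3", "e4", "e5"]

# Sent simultaneously: the engine may process the events in any order,
# so the single event retained by a size-1 window is unpredictable.
simultaneous = random.sample(events, len(events))
print(tuple_based_window(simultaneous, size=1))  # any one of the five

# Sent with a 1-second lag between events: arrival order is fixed,
# so the retained event is always the last one sent.
print(tuple_based_window(events, size=1))  # → ['e5']
```

Adding the lag turns a test whose expected output depends on engine scheduling into one with a single deterministic output file.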
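The Derby-versus-Oracle AVG difference mentioned above can also be reproduced on its own. The sketch assumes Derby's AVG over an INTEGER column returns an INTEGER (the fraction is discarded) while Oracle's AVG returns an exact NUMBER; verify this against the database versions you actually test with.

```python
from fractions import Fraction

# Integer values carried by the incoming events.
values = [1, 2, 2]

# Derby (assumed behavior): AVG over an INTEGER column yields an
# INTEGER, so the fractional part of 5/3 is discarded.
derby_avg = sum(values) // len(values)

# Oracle (assumed behavior): AVG yields a NUMBER keeping the fraction.
oracle_avg = Fraction(sum(values), len(values))

print(derby_avg)          # 1
print(float(oracle_avg))  # 1.666...
# Both results are valid, so the test needs one expected-output file
# per database context, selected via "Context based configuration".
```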

This page (revision-7) was last changed on 13-Feb-09 17:07 PM, -0800 by Prashant Bhagat