
Test Harness Usage

Download the Test Harness from the Releases/Downloads page.
The Test Harness was created to maintain quality in the component. Several standard tests were developed to exercise most (though definitely not all) capabilities and expected behaviours of the component. The project coordinator runs these tests frequently to detect code changes that inadvertently change component results. New issues identified with the component will typically be associated with a test definition that describes the desired behaviour of the component. Those tests are saved and accumulated to ensure continuous quality improvement.

Test Harness Contents

The test harness contains a Visual Studio solution with multiple Integration Services projects. Currently, the test harness is only available for BIDS 2008. The package named "Kimball SCD Test Harness.dtsx" is the master package for testing; the other packages should not need to be opened in BIDS, or edited. The master package calls the other packages to run standard and non-standard tests.

Configuring the Test Harness

Open the solution, and open the "Kimball SCD Test Harness.dtsx" package.
  1. Open the Variables window, and edit the TestHarnessFolder variable to refer to the folder where you unzipped the test harness solution.
  2. Open the "KimballSCDTestData" Excel Connection Manager, and edit the path to refer to the XLSX (a Microsoft Excel 2007 spreadsheet) file stored in the folder where you unzipped the test harness solution.

What the Test Harness Does

The test harness opens the Excel file and reads data stored on multiple sheets within that file.
The first sheet is labeled "Test Setup" and contains the test definitions: each test's code, a short description of what it tests, which test package it runs, and whether the test should be run.
The next sheets contain input data for the tests - one sheet for each input, one sheet for the expected output of the tests, and one sheet for any expected "invalid input".
When the master package runs, it creates a temporary folder (called KSCDTest) within the folder containing the solution, and creates a text file to hold results of the tests.
The master package then loops over the rows on the "Test Setup" sheet, determining whether each test should be run. If the test is enabled, the package creates a subfolder for that test within the KSCDTest folder, then reads the three "input" sheets and saves that information (fixing data types) into RAW files inside that folder. Those RAW files serve as the data sources for the test.
The sheets in the Excel file have a column called "Test Codes" that identifies which tests each row in the spreadsheet applies to. Most rows apply to multiple tests, so each test code is surrounded by pipe ("|") characters, allowing pattern matching to find the applicable rows when a test runs. The RAW files are filled only with the rows from the spreadsheet that apply to the currently running test.
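As an illustration of the pipe-delimited matching described above (a hypothetical Python sketch, not the master package's actual implementation; the test codes and column values here are invented examples):

```python
def rows_for_test(rows, test_code):
    """Select spreadsheet rows whose pipe-delimited "Test Codes"
    column contains the currently running test's code."""
    token = "|" + test_code + "|"  # e.g. "|T001|" matches exactly that code
    return [row for row in rows if token in row["Test Codes"]]

# Hypothetical rows: the first is tagged for two tests, the second for one.
rows = [
    {"Test Codes": "|T001|T002|", "CustomerKey": 1},
    {"Test Codes": "|T003|",      "CustomerKey": 2},
]
print(rows_for_test(rows, "T001"))  # only the first row applies to T001
```

Surrounding each code with pipes prevents a code like "T1" from accidentally matching "T10" during the substring search.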
The master package then executes the DTSX file identified in the test definition. Most tests use the "standard test" child package, but some use other packages. The "standard test" package is used for scenarios involving specific arrangements of input row data. The other packages are built for special cases: scenarios involving special configurations of the component, or particular data types.
The child packages all follow roughly the same format: they read the data from the RAW files, conduct SCD processing using the component, and collect all of the outputs (except Auditing and Invalid Input) into one output RAW file in which each row identifies which output it came from. The Auditing and Invalid Input rows are written to separate RAW files for later analysis.
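The consolidation step described above, tagging each row with the output that produced it before writing everything to one file, can be sketched as follows (a hedged Python illustration; the output names and key values are invented, and the component's real outputs may differ):

```python
# Hypothetical output names and rows; the real child packages work
# on SSIS pipeline buffers, not Python dictionaries.
outputs = {
    "New": [{"CustomerKey": 1}],
    "Unchanged": [{"CustomerKey": 2}, {"CustomerKey": 3}],
}

# Collect every row into one list, adding an "Output" column so a
# single RAW file can record which output each row came from.
combined = []
for output_name, rows in outputs.items():
    for row in rows:
        tagged = dict(row)          # copy so the source rows stay untouched
        tagged["Output"] = output_name
        combined.append(tagged)
```

Recording the origin in a column is what lets the later comparison step check not just that a row appeared, but that it appeared on the correct output.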
After the child package executes, any error is trapped and written to the results file. If the package completed, the results are analyzed: the output rows are compared to the expected outputs for the test from the Excel sheet, and exceptions are noted in the results text file.
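A minimal sketch of that comparison step, assuming rows can be treated as unordered sets (a hypothetical illustration; the harness's actual logic lives inside the master package):

```python
def compare_results(actual_rows, expected_rows):
    """Compare a test's actual output rows against the expected rows
    from the spreadsheet, returning the rows that are missing and the
    rows that appeared unexpectedly, for the results text file."""
    # Freeze each row dictionary into a hashable, order-independent form.
    actual = {tuple(sorted(r.items())) for r in actual_rows}
    expected = {tuple(sorted(r.items())) for r in expected_rows}
    missing = expected - actual        # expected but not produced
    unexpected = actual - expected     # produced but not expected
    return missing, unexpected
```

An empty result for both sets means the test passed; anything else would be written to the results file as an exception.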
After all tests are performed, the text file that has been collecting the results is shown.

Creating Your Own Test

If you experience unexpected behaviour with the component, the best way to inform the developers is to reproduce the behaviour. Hopefully, you can reproduce it with your production data and determine what characteristics of that production scenario cause the problem. Understandably, even if you're able to do so, you're probably not going to want to (or be able to) send that production scenario to the developers for analysis. The next best thing is to create a similar scenario using the test harness.
Open the Excel file and add a new test to the "Test Setup" sheet. Disable all other tests. Add new rows to the input and expected output sheets that are contrived to demonstrate the unexpected behaviour you experienced. Run the test harness to confirm it reports an exception against the expected results. Once that is complete, post a new discussion or issue and attach the spreadsheet to it.

Last edited Nov 17, 2009 at 6:00 AM by toddmcdermid, version 2
