Friday, February 29, 2008

NANT Scripts Rock

I recently had the fortunate opportunity to be given a project of my own to work on.  Agile practices are extremely important to me and I do my best to put them into practice every day.  We currently use CruiseControl.NET as our continuous integration server, and I wanted to get our tests integrated into the process, so I decided to create a new build script and a new CruiseControl.NET project for this purpose.  For those of you who are pausing and saying, "What? You do not have your tests integrated into your build?"  I agree.  Sometimes things take a little longer to get into place than we would like, but we are getting there slowly but surely.


I must say that after my experience today I have an even higher level of respect for build/release engineers.  It took me all day to get the NAnt script to do what I wanted and to produce the test results output I was after.  As a practice we keep all of our test projects in a directory named tests at the root of our directory hierarchy.  I created a new solution that contains only the tests I wanted to integrate into the build.
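For a picture of what that looks like, here is a rough sketch of the layout (the project names are made up for illustration; the parts that matter are the tests directory and the usual bin\&lt;configuration&gt; output convention):

MyTests.sln
tests\
    SomeProject.Tests\
        SomeProject.Tests.csproj
        bin\Debug\SomeProject.Tests.dll
    AnotherProject.Tests\
        AnotherProject.Tests.csproj
        bin\Debug\AnotherProject.Tests.dll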


The build script runs msbuild on the solution file and then copies the directory structure into a build-temp directory, which is where nunit-console.exe is run against each test DLL.  The process is broken into a number of targets: build the solution, copy the test project structure to the build-temp directory, run nunit-console.exe on the test DLLs, and then copy the results of those tests to the test-results directory.  The script looks a little like this:
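First, the properties the targets reference and the build target that compiles the solution.  Neither is anything fancy, so the sketch below is an approximation (the property values and solution name are illustrative, not our real paths):

<property name="build.dir" value="build-temp"/>
<property name="results.dir" value="results"/>
<property name="test.results" value="${results.dir}\test-results"/>
<property name="build-configuration" value="Debug"/>
<property name="nunit-console.exe" value="C:\Program Files\NUnit 2.4\bin\nunit-console.exe"/>

<target name="build">
    <echo message="Building the test solution (${build-configuration})"/>
    <!-- plain exec rather than a dedicated msbuild task; assumes msbuild is on the PATH -->
    <exec program="msbuild">
        <arg value="MyTests.sln"/>
        <arg value="/p:Configuration=${build-configuration}"/>
    </exec>
</target>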


This target copies the built test projects into the build-temp directory:


<target name="copy-tests" depends="build">
     <echo message="Copying tests to ${build.dir}\tests" />
     <mkdir dir="${build.dir}\tests"/>
     <copy todir="${build.dir}\tests">
        <fileset basedir="tests" failonempty="true">
          <include name="*PatternOfTestsToInclude*/**"/>
        </fileset>
    </copy>
</target>


This target iterates through the directories and executes nunit-console.exe on each test DLL, then moves each result file to the ${test.results} directory:

<target name="tests" depends="copy-tests">
     <echo message="Create directory ${results.dir}"/>
     <mkdir dir="${results.dir}"/>
     <echo message="Create directory ${test.results}"/>
     <mkdir dir="${test.results}"/>
     <echo message="Running Nunit Tests"/>
     <foreach item="Folder" property="foldername">
         <in>
             <items>
                 <include name="${build.dir}\tests\*"/>
             </items>
         </in>
        <do>
            <echo message="Iterating through folder ${foldername}\bin\${build-configuration}"/>
            <foreach item="File" property="filename">
                <in>
                    <items>
                         <include name="${foldername}\bin\${build-configuration}\*Test*.dll"/>
                    </items>
                </in>
        <do>
            <echo message="Running test for file ${filename}"/>
            <echo message="Writing test results to test-results.xml for dll ${filename}"/>
            <exec program="${nunit-console.exe}" failonerror="false" resultproperty="testresult">
                <arg value="${filename}"/>
                <arg value="/xml=test-results.xml" />
            </exec>
            <property name="niceFileName" value="${path::get-file-name-without-extension(filename)}"/>
            <move
                 file="test-results.xml"
                 tofile="${test.results}\${niceFileName}-test-results.xml"
                 overwrite="true"/>
            <fail message="Failures reported in unit test for ${filename}." unless="${int::parse(testresult)==0}" />
        </do>
    </foreach>
</do></foreach></target>
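The last piece is getting those per-DLL result files into the CruiseControl.NET build report.  That happens in ccnet.config rather than in the NAnt script, with a merge publisher picking up the XML files.  Roughly (the project name and paths here are illustrative, not our actual setup):

<project name="MyProject-Tests">
    <tasks>
        <nant>
            <baseDirectory>C:\projects\MyProject</baseDirectory>
            <buildFile>tests.build</buildFile>
            <targetList>
                <target>tests</target>
            </targetList>
        </nant>
    </tasks>
    <publishers>
        <merge>
            <files>
                <file>C:\projects\MyProject\results\test-results\*-test-results.xml</file>
            </files>
        </merge>
        <xmllogger/>
    </publishers>
</project>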


This approach is cool because I can inject into the script the pattern used to select the groups of tests I want as part of my test suite.  In my situation this makes sense because I only want the tests that are specific to my project.  As long as you use some standard naming conventions for your tests, you are good.
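To actually inject the pattern rather than hard-code it, the include in the copy-tests fileset can read a property with a default (test.pattern is just an illustrative name; overwrite="false" means a value supplied from outside wins):

<property name="test.pattern" value="MyProject" overwrite="false"/>

<include name="*${test.pattern}*/**"/>

Then the command line, or the CruiseControl.NET nant task, can supply a different pattern:

nant tests -D:test.pattern=SomeOtherProject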



