JUnit test report enrichment with JavaDoc

Posted 2019-01-16 10:43

Question:

For a customer we need to generate detailed test reports for integration tests which not only show that everything is green, but also what each test did. My colleagues and I are lazy, and we do not want to hand-maintain spreadsheets or text documents.

For that, I am thinking about documenting the more complex integration tests with JavaDoc comments on each @Test-annotated method and each test class. For the testers it is a great help to see which requirement, Jira ticket, or whatever else the test is linked to, and what the test actually tries to do. We want to provide this information to our customer, too.

The big question now is: How can we put the JavaDoc for each method and each test class into the JUnit reports? We use JUnit 4.9 and Maven.

I know that there is a description for each assertXXX(), but what we really need is a nice HTML list or a PDF document which lists all classes with their documentation, and below that all @Test methods with their description, the testing time, the result and, if failed, the reason why.

Or is there another alternative to generate fancy test scripts? (Or should we start an OpenSource project on this!? ;-) )

Update: I asked another question on how to add a RunListener to Eclipse to have it also report in Eclipse when started there. The proposed solution with a custom TestRunner is another possibility to have the test results report. Have a look: How can I use a JUnit RunListener in Eclipse?

Answer 1:

One way to achieve this would be to use a custom RunListener, with the caveat that it would be easier to use an annotation rather than javadoc. You would need to have a custom annotation such as:

@TestDoc(text="tests for XXX-342, fixes customer issue blahblah")
@Test
public void testForReallyBigThings() {
    // stuff
}
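The annotation itself is not spelled out above; a minimal definition might look like the following sketch (the name TestDoc is just the hypothetical one from the example — the important part is runtime retention, so the text is still readable via reflection when the tests run):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class TestDocDemo {

    // Hypothetical documentation annotation; RUNTIME retention makes it
    // visible to reflection (and thus to a listener) while tests execute.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface TestDoc {
        String text();
    }

    @TestDoc(text = "tests for XXX-342, fixes customer issue blahblah")
    public void testForReallyBigThings() {
        // stuff
    }

    // Looks up the documentation text for a test method by name.
    public static String docFor(String methodName) throws Exception {
        Method m = TestDocDemo.class.getMethod(methodName);
        TestDoc doc = m.getAnnotation(TestDoc.class);
        return doc == null ? null : doc.text();
    }

    public static void main(String[] args) throws Exception {
        // prints "tests for XXX-342, fixes customer issue blahblah"
        System.out.println(docFor("testForReallyBigThings"));
    }
}
```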

RunListener (org.junit.runner.notification.RunListener) listens to test events such as test start, test end, test failure, test success, etc.:

public class RunListener {
    public void testRunStarted(Description description) throws Exception {}
    public void testRunFinished(Result result) throws Exception {}
    public void testStarted(Description description) throws Exception {}
    public void testFinished(Description description) throws Exception {}
    public void testFailure(Failure failure) throws Exception {}
    public void testAssumptionFailure(Failure failure) {}
    public void testIgnored(Description description) throws Exception {}
}

Description contains the annotations applied to the test method, so using the example above you can get the TestDoc annotation with:

description.getAnnotation(TestDoc.class);

and extract the text as normal.

You can then use the RunListener to generate the files you want, with the text specific to each test, whether the test passed, failed, or was ignored, the time taken, etc. This would be your custom report.
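A minimal listener along these lines might look like the sketch below (assuming a hypothetical @TestDoc annotation with runtime retention; the plain-text report format and file name are just illustrative):

```java
import java.io.FileWriter;
import java.io.PrintWriter;

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

// Sketch only: writes one block per test, including its @TestDoc text.
public class MyResultListener extends RunListener {

    private PrintWriter out;

    @Override
    public void testRunStarted(Description description) throws Exception {
        out = new PrintWriter(new FileWriter("test-doc-report.txt"));
    }

    @Override
    public void testStarted(Description description) {
        // Read the documentation annotation off the test method, if present.
        TestDoc doc = description.getAnnotation(TestDoc.class);
        out.printf("%s.%s%n", description.getClassName(), description.getMethodName());
        out.printf("  doc: %s%n", doc != null ? doc.text() : "(undocumented)");
    }

    @Override
    public void testFailure(Failure failure) {
        out.printf("  FAILED: %s%n", failure.getMessage());
    }

    @Override
    public void testRunFinished(Result result) {
        out.printf("%d tests, %d failures, %d ms%n",
                result.getRunCount(), result.getFailureCount(), result.getRunTime());
        out.close();
    }
}
```

Generating HTML or PDF instead of plain text is then just a matter of swapping the output code.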

Then, in surefire, you can specify a custom listener, using:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.10</version>
  <configuration>
    <properties>
      <property>
        <name>listener</name>
        <value>com.mycompany.MyResultListener,com.mycompany.MyResultListener2</value>
      </property>
    </properties>
  </configuration>
</plugin>

This is from the Maven Surefire Plugin documentation: Using JUnit, Using custom listeners and reporters.

This solution has the disadvantage that you don't have the flexibility of JavaDoc as far as line breaks and formatting are concerned, but it has the advantage that the documentation lives in one specific place: the TestDoc annotation.



Answer 2:

Have you looked at Maven Surefire reports?

You can generate an HTML report from your JUnit tests.

http://maven.apache.org/plugins/maven-surefire-report-plugin/

I'm not sure how customizable it is though. But it's a good starting point.

I also know that TestNG (an alternative to JUnit) has some report-generating capabilities: http://testng.org/doc/documentation-main.html#logging-junitreports

I would also recommend log4j http://logging.apache.org/log4j/1.2/manual.html



Answer 3:

You can use jt-report, an excellent framework for test reporting.



Answer 4:

I have created a program using TestNG and iText which outputs the test results in a nice PDF report. You can put a description of your test in the @Test annotation, and that can be included in the PDF report as well. It provides the run times of the individual tests and of the entire suite. It is currently being used to test web apps with Selenium, but that part could be ignored. It also allows you to run multiple test suites in one run, and if tests fail, it allows you to re-run only those tests without having to re-run the entire suite; those results will be appended to the original results PDF. See below the image for a link to the source if you are interested. I wouldn't mind this becoming an open-source project, as I have a good start on it, though I'm not sure how to go about doing that. Here's a screenshot

So I figured out how to create a project on sourceforge. Here's the link sourceforge link



Answer 5:

As mentioned above, Maven is definitely the way to go; it makes life really easy. You can create a Maven project pretty easily using the m2eclipse plugin. Once that is done, just run these commands:

cd <project_dir_where_you_have_pom_file>
mvn site:site

This command will create the style sheets for you. In the same directory, run:

mvn surefire-report:report

This will run the test cases and convert the output to html. You can find the output in the 'target/site/surefire-report.html'.

As you can see in the generated report, all the test cases (written in JUnit) are shown in the HTML, along with other meta information such as the total number of test cases run, how many were successful, the time taken, etc. (Since I cannot upload an image, I can't show the output here.)

You can go a step further and specify the exact versions of the plugins to use, like:

mvn org.apache.maven.plugins:maven-site-plugin:3.0:site org.apache.maven.plugins:maven-surefire-report-plugin:2.10:report


Answer 6:

Maybe it is worth taking a look at "executable specification" / BDD tools like FIT/FitNesse, Concordion, Cucumber, JBehave, etc.

With this practice you will not only satisfy the customer's requirement formally, but you will also be able to bring transparency to a new level.

Briefly, all these tools allow you (or, better, the customer) to define scenarios using natural language or tables, define bindings from natural-language constructs to real code, and run these scenarios to see whether they succeed or fail. You effectively get a "live" spec which shows what already works as expected and what does not.
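The "binding" idea at the heart of these tools can be illustrated in miniature with plain Java (a toy sketch, not a real BDD framework — the scenario and patterns here are made up): a natural-language step is matched against a registered pattern and dispatched to code.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Consumer;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Toy illustration of BDD-style step bindings: each pattern maps a
// natural-language step to code; captured groups become parameters.
public class ToySpecRunner {

    private final Map<Pattern, Consumer<Matcher>> bindings = new LinkedHashMap<>();
    private int balance;

    public ToySpecRunner() {
        bindings.put(Pattern.compile("Given a balance of (\\d+)"),
                m -> balance = Integer.parseInt(m.group(1)));
        bindings.put(Pattern.compile("When I deposit (\\d+)"),
                m -> balance += Integer.parseInt(m.group(1)));
        bindings.put(Pattern.compile("Then the balance is (\\d+)"),
                m -> {
                    if (balance != Integer.parseInt(m.group(1))) {
                        throw new AssertionError("expected " + m.group(1) + ", got " + balance);
                    }
                });
    }

    // Runs one step; returns false if no binding matches (an "undefined" step).
    public boolean run(String step) {
        for (Map.Entry<Pattern, Consumer<Matcher>> e : bindings.entrySet()) {
            Matcher m = e.getKey().matcher(step);
            if (m.matches()) {
                e.getValue().accept(m);
                return true;
            }
        }
        return false;
    }
}
```

Real frameworks like Cucumber or JBehave do essentially this, plus parsing whole scenario files, reporting, and much richer parameter handling.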

See a good discussion on these tools: What are the differences between BDD frameworks for Java?