To start with, TestNG is designed as a unit testing framework, so its reporting is correspondingly limited. The trouble starts when we use it for functional testing and expect its reports to support automation needs. Using Selenium WebDriver in conjunction with TestNG (or any unit testing framework) gives us more flexibility in organizing and executing tests, but it also leaves us with basic unit-testing reports that don't fit the bill.
In the best scenario, every tester or developer loves to see all tests passing after an automation run. That happens, if at all, only in the late stages of a product or project; during the initial stages, tests and code are bound to be fragile and therefore to fail. Dealing with failed scenarios is thus a mandatory and major task while preparing tests.
To analyze failed scenarios we rely on the information in the current reports (essentially a stack trace), and if the cause is still unclear we re-run the scenarios and verify them manually. This is why unit-testing reports don't fit the bill for Selenium automation: since we can't figure out what exactly happened, we end up reproducing tests with manual effort.
Considering TestNG, its report structure contains the following:
1. Test suites
2. Test cases
3. Test status (Pass/Fail/Skipped)
   a. Exceptions in case of failures
4. Description of test cases
5. Logging information
Here is some information that would help zero in on the issue with a failing scenario, i.e. whether the test is flaky or the application actually went wrong somewhere:
§ Browser screenshots: After seeing an exception, the first thing most of us want to see is a screenshot of the browser, for any hint of what happened, even before visiting the codebase.
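One common way to get screenshots attached to failures is a TestNG listener that captures the browser whenever a test fails. The sketch below assumes Selenium and TestNG are on the classpath; `DriverHolder` is a hypothetical accessor for whatever WebDriver your tests use, not part of either library.

```java
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.Reporter;

// Hypothetical holder for the driver used by the currently running test.
class DriverHolder {
    private static WebDriver driver;
    static void set(WebDriver d) { driver = d; }
    static WebDriver get() { return driver; }
}

// Sketch of a failure-screenshot listener; register it via @Listeners or testng.xml.
public class ScreenshotListener implements ITestListener {

    // Pure helper: build a predictable file name from the test method name.
    static String screenshotName(String testName) {
        return testName + ".png";
    }

    @Override
    public void onTestFailure(ITestResult result) {
        WebDriver driver = DriverHolder.get();
        if (!(driver instanceof TakesScreenshot)) {
            return;
        }
        try {
            File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            Path target = Path.of("screenshots", screenshotName(result.getName()));
            Files.createDirectories(target.getParent());
            Files.copy(shot.toPath(), target, StandardCopyOption.REPLACE_EXISTING);
            // Surface the screenshot path next to the failure in the report.
            Reporter.log("Screenshot: " + target);
        } catch (Exception e) {
            Reporter.log("Could not capture screenshot: " + e.getMessage());
        }
    }
}
```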
§ Browser & platform info: Right now this can only be included by creating reports manually. In automation everyone eventually runs tests across different browser environments, so rather than carrying that information as a manual overhead, it would be good if reports mapped each test execution to its browser environment by default.
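Mapping a run to its environment can be done by reading the driver's capabilities at session start. A minimal sketch, assuming Selenium 4 (where `getBrowserVersion` and `getPlatformName` are the accessor names); the string overload is just a hypothetical pure helper for formatting the report tag.

```java
import org.openqa.selenium.Capabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

// Sketch: tag each test execution with its browser environment by default.
public final class EnvironmentInfo {
    private EnvironmentInfo() {}

    public static String describe(RemoteWebDriver driver) {
        Capabilities caps = driver.getCapabilities();
        return describe(caps.getBrowserName(),
                        String.valueOf(caps.getBrowserVersion()),
                        String.valueOf(caps.getPlatformName()));
    }

    // Pure formatting helper, e.g. "chrome 120 on LINUX".
    static String describe(String browser, String version, String platform) {
        return browser + " " + version + " on " + platform;
    }
}
```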
§ File bugs directly from the report: Anyone should be able to report a bug directly from the report itself. While reporting, it should also attach all the information from the report (screenshots, exception, test description and logged info) to the respective bug.
§ High-level reports: Reports by default assume the audience is developers and testers, and miss everyone else. They should also provide a high-level representation of the test data alongside the detailed one.
§ Error levels: Going a step further, it would be good to categorize errors by severity. This helps to address critical errors first and move on to mild errors later. All assertion errors are critical errors: the actual and expected results don't match, so they are potential bugs or flaky tests. Next come mild errors, typically exceptions such as "NoSuchElementException" and "InvalidElementStateException".
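This two-level split can be sketched as a small classifier. The "mild" bucket below matches exception types by simple class name so the sketch compiles without the Selenium jars; the bucket names and the choice of which types are mild are assumptions for illustration.

```java
import java.util.Set;

// Sketch: bucket test failures into critical vs. mild, per the scheme above.
public final class ErrorLevel {
    private ErrorLevel() {}

    // Exception types treated as mild; matched by simple class name
    // so this sketch does not depend on the Selenium libraries.
    private static final Set<String> MILD = Set.of(
            "NoSuchElementException", "InvalidElementStateException");

    public static String classify(Throwable t) {
        if (t instanceof AssertionError) {
            return "CRITICAL";   // actual vs. expected mismatch: potential bug
        }
        if (MILD.contains(t.getClass().getSimpleName())) {
            return "MILD";       // likely locator/timing issue, triage later
        }
        return "UNCLASSIFIED";
    }
}
```

A report generator can then sort failures so the critical bucket appears first.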
§ Logging info: The current TestNG report does display the logged info, but on a separate tab/page, whereas it should be displayed in the same area as the test status, so that related information sits in one place and saves effort.
There are, though, a few options available to enhance TestNG reports further.
ATU Reporter: an open-source reporting utility that can be customized to create graphical reports. Yet it doesn't fulfill the most important requirements above.
In TestNG we can also create custom reports by implementing the IReporter interface, and building simple reports is quite easy. But the more complex the report you want, the faster the implementation complexity grows. Implementing the above requirements using TestNG's interfaces is quite challenging, and any small customization involves a lot of code change.
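To illustrate the level at which IReporter stays simple, here is a minimal sketch, assuming TestNG on the classpath, that prints one summary line per suite; anything richer (screenshots, environment info, severity buckets) quickly needs far more code.

```java
import java.util.List;
import java.util.Map;

import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.ISuiteResult;
import org.testng.ITestContext;
import org.testng.xml.XmlSuite;

// Minimal custom reporter: one pass/fail/skip summary line per suite.
public class SummaryReporter implements IReporter {

    // Pure helper: format a single summary row.
    static String row(String suite, int passed, int failed, int skipped) {
        return suite + ": " + passed + " passed, "
                + failed + " failed, " + skipped + " skipped";
    }

    @Override
    public void generateReport(List<XmlSuite> xmlSuites,
                               List<ISuite> suites, String outputDirectory) {
        for (ISuite suite : suites) {
            int passed = 0, failed = 0, skipped = 0;
            for (Map.Entry<String, ISuiteResult> e : suite.getResults().entrySet()) {
                ITestContext ctx = e.getValue().getTestContext();
                passed += ctx.getPassedTests().size();
                failed += ctx.getFailedTests().size();
                skipped += ctx.getSkippedTests().size();
            }
            System.out.println(row(suite.getName(), passed, failed, skipped));
        }
    }
}
```

The reporter is registered in testng.xml as a `<listeners>` entry, like any other listener.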