
Best Practices

Iridium has evolved a reasonably complex set of steps that can be used to test or define the expected behaviour of a web application. This raises the question of best practices when writing Iridium test scripts. In this article we’ll take a look at the features in Iridium that allow you to write test scripts that are reliable, easy to read and easy to maintain.

Comments

Gherkin allows you to add comments after the hash symbol (#), or as free text underneath Features or Scenarios.

The code folding feature in Atom will display free text comments when a Scenario is folded. This gives the reader a way to fold long scripts while still being able to read these free text comments.

Important
It is recommended that free text comments be used under Feature or Scenario headings instead of adding comments with a hash symbol.
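
As a minimal sketch (the feature, scenario and element names are only illustrative), free text comments sit directly under the Feature or Scenario heading, while hash comments can appear anywhere:

Feature: Buy a product
  This free text description explains the intent of the feature and remains
  visible when the scenarios below are folded in Atom.

  Scenario: Add a product to the shopping cart
    Opens the application and places a single product in the cart.
    # A hash comment like this is also valid, but it is hidden when the
    # scenario is folded.
    Given I open the application
    And I click the element found by "Add to cart"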

A question of timing

When interacting with web applications, the issue of timing will inevitably arise. This is because the performance of a web app is highly dependent on every piece of infrastructure sitting underneath it (networks, databases, web servers, etc.) and also on the characteristics of the browser being tested.

Iridium deals with these performance inconsistencies by explicitly or implicitly waiting for a period of time for elements to be available on the page, or by waiting before steps are executed.

Delaying step execution

The delay between each step can be configured with the step:

I set the default wait time between steps to "5" seconds

This step defines the amount of time that Iridium will wait between steps that interact with the web page. By default the delay is set to 2 seconds. This delay means that the test will not zip through the script at inhuman speeds, which is important because web apps often hide and reveal elements with animations in response to interactions; these animations are fast enough not to hold up a human user, but may be too slow when a computer is interacting with the page at full speed.

Important
It is recommended that the delay be set to at least 1 second to simulate the delay of a human user as they move the mouse or tab through the form fields.

Explicitly waiting for elements

As new pages are loaded we need some way to know when the page is ready to interact with. Typically this is done by pausing the execution of the script until an element, like a page heading, is displayed. This can be achieved with the step:

I wait "30" seconds for the element found by "Page Heading" to be displayed

This step pauses the execution of the script for up to 30 seconds until the desired element is displayed on the page. There are other variations like this step, which waits for an element to be clickable:

I wait "30" seconds for the element found by "Submit Form" to be clickable

And this one, which waits for the element to be present in the DOM but not necessarily displayed on the page:

I wait "30" seconds for the element found by "Submit Form" to be present

Important
It is recommended that explicit wait steps be used to indicate points in the test that are expected to take some time to complete.
Important
Be aware that services like BrowserStack place a limit on how long the browser can be idle before the session is terminated. For this reason waiting longer than 90 seconds is not recommended.

Implicitly waiting for elements

Every step in Iridium will wait a short amount of time for the element that it interacts with to be available before continuing. For example the step:

I click the element found by "Submit"

will by default wait up to 2 seconds for the Submit button to be clickable before the click is performed. This delay gives the web app some leeway for the required elements to become available on the page.

You can define the amount of time that steps will wait for elements to be available with the step:

I set the default wait for elements to be available to "30" seconds

Setting the default wait time to 30 seconds means that steps will wait up to 30 seconds for the elements that they interact with to be available. If the elements are immediately available then the steps will be executed immediately, so setting the default wait time to a high value won’t make your test scripts slower; it just makes them more tolerant of web apps that are slow to load and update.

Important
If a slow web application should cause the test to fail, it is recommended that you set the default wait time to a small value. Otherwise set the default wait time to a high value to allow the test to complete when there are performance issues.

A question of abstraction

The Gherkin dialect exposed by Iridium has to walk a line between exposing steps that allow web apps to be thoroughly tested and not relying on implementation details like the IDs, XPaths or CSS selectors of the underlying HTML.

This is an example of a step that does rely on implementation details:

I click the element with the xpath of "//*[@id='status-image-popup']/img"

This is a perfectly valid Iridium step, but is not particularly readable. You have to have a very good understanding of HTML or XML to know what an XPath is, and even then it is often difficult to understand what element an XPath is referring to.

A better solution is to refer to an alias. First you define the alias value with the step:

And I set the alias mappings
            | Build Status           | //*[@id='status-image-popup']/img |

And then reference the aliased value with the step:

I click the element with the xpath alias of "Build Status"

This references the cryptic XPath via a human readable alias, which makes the step much more readable.

But this step still has a downside: it adds a dependency on the XPath of the element. If you are writing a test script before the web app has even been designed, there is no reason to think that an XPath is the best way to select the element.

A more readable solution is to use a “found by” step, like this:

I click the element found by alias "Build Status"

Steps with the phrase “found by” will scan for elements based on their:

  • name attribute
  • value attribute
  • id attribute
  • class
  • XPath
  • CSS Selector
  • Normalized text content

What this means is that we no longer need to define how an element is found. Rather we provide a value that we know uniquely identifies the element and let Iridium try the various combinations until the element is found.

We can actually go one step further and rely on the AutoAlias functionality in Iridium to make the step even more readable.

I click the element found by "Build Status"

When AutoAlias is enabled (and it is enabled by default), Iridium will first attempt to find the value being used from the alias mappings. If an alias is found, the value it references is used. Otherwise the supplied text is assumed to be one of the 7 types of selectors listed above.

You can specifically enable AutoAliasing with the step:

I enable autoaliasing

And disable it with the step:

I disable autoaliasing

Important
It is recommended that you leave AutoAliasing enabled, that you use the “found by” steps without specifically referencing an alias, and that any selector that is not self evident be referenced by a human readable alias name.
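
Putting these recommendations together, a typical script first defines the alias mapping and then refers to it with a plain “found by” step (this simply combines the steps shown above):

And I set the alias mappings
            | Build Status           | //*[@id='status-image-popup']/img |

And I click the element found by "Build Status"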

A question of code duplication

Using tags

Gherkin allows you to tag scenarios, which can be a handy way to implement test paths in a feature. For example, this feature uses tags to define two different test paths for development and production environments:

Feature: Buy a product
  @dev @prod
  Scenario: Add a product to the shopping cart
    Given I open the application
    And I click the element found by "Playstation 4"
    And I click the element found by "Add to cart"
    And I click the element found by "Checkout"

  # Production should always recommend the VR headset.
  @prod
  Scenario: Verify product recommendations
    Then I verify that the page contains the text "Customers who bought the Playstation 4 also bought the VR headset"

  # Only purchase items in the development environments
  @dev
  Scenario: Buy the product
    # Dev environments always have elements in stock
    Then I verify that the page contains the text "Item is in stock"
    And I click the element found by "Buy Now"
    Then I verify that the page contains the text "You have just purchased a Playstation 4"

Running Iridium with the command:

java -DappURLOverride=https://example.org -DtagsOverride=@dev -DtestSource=C:\playstation.feature -DtestDestination=Chrome -jar IridiumApplicationTesting.jar

will run all scenarios tagged with @dev. Alternatively running the command:

java -DappURLOverride=https://example.org -DtagsOverride=@prod -DtestSource=C:\playstation.feature -DtestDestination=Chrome -jar IridiumApplicationTesting.jar

will run all scenarios tagged with @prod.

Using tags allows you to share common scenarios while including (or excluding) various scenarios for a given execution path.

Using fragments

Iridium allows you to split up test scripts into multiple files and then import them, for example:

Feature: Test Home Quote and Sales (HQS)

    @home-launcher
    Scenario: start HQS application
        #IMPORT: home/home-launcher.fragment

    @quote-journey
    Scenario: YOUR DETAILS PAGE
        #IMPORT: home/quote/cover-details.fragment

Splitting up and importing content is valuable when you are sharing content between scripts. For example, you may have a number of common security tests that you run across all your applications, or each of your tests may share a common login procedure. In these cases, the #IMPORT statement can be used to remove copied and pasted content, which is valuable because duplicated code makes tests hard to maintain.
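
As a sketch of the shared login case (the fragment path and scenario name are only illustrative), every feature imports the same fragment file that holds the login steps:

Feature: Test the account page

    # The shared login steps live in a single fragment file.
    # The path "common/login.fragment" is hypothetical.
    @login
    Scenario: Log in to the application
        #IMPORT: common/login.fragment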

But use the #IMPORT feature sparingly. It is difficult to jump between files and keep track of what the test script is doing in your head. You’ll find that IDE features like code folding are much better solutions than the #IMPORT feature for managing long test scripts.

Important
It is recommended that tags be used to share common scenarios and create execution paths in a feature. It is recommended that the #IMPORT feature only be used to reference duplicated content between features.

A question of inputs

There are two options when it comes to providing inputs that are to be looped over during a test: scenario outlines and Iridium data set collections.

Scenario Outlines

The first is to use Gherkin scenario outlines. These provide a table of values that are passed into iterations of a scenario:

Feature: Open an application

 # This is where we give readable names to the xpaths, ids, classes, name attributes or
 # css selectors that this test will be interacting with.
  Scenario: Generate Page Object
    Given the alias mappings
      | Search Menu  | dropdownMenu2 |
      | Search Field | search        |

  # Open up the web page
  Scenario: Launch App
    And I set the default wait time between steps to "2"
    And I open the application
    And I maximise the window
    And I click the element found by alias "Search Menu"

  Scenario Outline: Test the search box
    And I clear the hidden element with the ID alias of "Search Field"
    And I populate the element with the ID alias of "Search Field" with "<search>"

    Examples:
      | search |
      | Java   |
      | Devops |
      | Linux  |
      | Agile  |

Scenario outlines are a great way to embed test data into a scenario when testing an individual feature, as we have done in the example above where a scenario outline has been used to test a search box.

Data Set Collections

Data set collections are a feature unique to Iridium. They are collections of alias mappings that are passed into iterations of a feature.

Data sets are defined in an XML or CSV file, for example:

<profile>
   <dataSets>
      <commonDataSet>
          <setting name="CarRegoValue">ASD123</setting>
      </commonDataSet>
      <dataSet>
         <!-- Car Details -->
         <setting name="CarDetailsYearIndex">2</setting>
         <setting name="CarDetailsMakeIndex">1</setting>
         <setting name="CarDetailsTransmissionIndex">2</setting>
         <setting name="CarDetailsFuelIndex">1</setting>
         <setting name="CarDetailsModelIndex">2</setting>
         <setting name="CarDetailsRedbookCodeIndex">2</setting>
         <setting name="CarDetailsUseIndex">1</setting>
         <setting name="CarDetailsOdometerIndex">1</setting>
         <setting name="CarRegoValue">ASD123</setting>
      </dataSet>
      <dataSet>
         <!-- Car Details -->
         <setting name="CarDetailsYearIndex">3</setting>
         <setting name="CarDetailsMakeIndex">2</setting>
         <setting name="CarDetailsTransmissionIndex">3</setting>
         <setting name="CarDetailsFuelIndex">2</setting>
         <setting name="CarDetailsModelIndex">3</setting>
         <setting name="CarDetailsRedbookCodeIndex">3</setting>
         <setting name="CarDetailsUseIndex">2</setting>
         <setting name="CarDetailsOdometerIndex">2</setting>
      </dataSet>
   </dataSets>
</profile>

You can pass in the data sets by running Iridium with the command:

java -DappURLOverride=https://example.org -Ddataset=inputs.xml -DtestSource=C:\playstation.feature -DtestDestination=Chrome -jar IridiumApplicationTesting.jar

This command will run the test script twice, once for each data set in the XML file. The results are then merged into a single JUnit XML report, MergedReport.xml.

Comparison between data sets and scenario outlines

Scenario outlines are similar to alias mappings in that they both map values to keys that are referenced by steps.

The big difference between data sets and scenario outlines is their scope. Scenario outlines define values that are referenced from a single scenario, whereas data sets define values that are used across the entire feature.

Important
When testing an isolated feature that can be contained in a scenario, it is recommended that a scenario outline be used. When writing end-to-end tests that reference different related variables across scenarios, it is recommended that data sets be used.

How to structure tests

See Introducing PICQT for Writing Cucumber Tests With Iridium for details on how to structure Iridium tests.

JavaScript Steps vs Custom Events

Iridium does a good job of interacting with standard HTML elements like buttons, checkboxes and textboxes. But you will often have to interact with advanced, custom widgets like sliders, address capture elements and maps.

There are two ways to interact with these elements.

The first is to use steps that expose custom events, like:

And I "mousedown" "50%" horizontally and "50%" vertically within the area of the element found by "slider"

The second is to run raw JavaScript, like:

And I run the following JavaScript and save the result to alias "Slider Value"
  """
  $("#slider").slider('value', 50);
  return $("#slider").slider( "value" );
  """

Both kinds of interaction have pros and cons.

These are reasons to consider using custom event steps:

  • These steps can be understood by non-developers.
  • These steps accurately emulate the actions taken by end users.

These are reasons to consider not using custom event steps:

  • It can be quite difficult to work out what sequence of events a widget actually responds to. Does it use mouseup, mousedown or click? Do focus and blur events affect the operation of the widget?
  • You will quite often have to emulate multiple events to achieve a single action. This reduces the readability of the script.

These are reasons to consider using raw JavaScript:

  • Modern UI widgets have comprehensive and well documented APIs.
  • It is often easier to interact with complex widgets via JavaScript.
  • APIs usually have some level of consistency and compatibility between versions.

These are reasons to consider not using raw JavaScript:

  • The resulting step will be unreadable by anyone other than developers.
  • Your test script is no longer going to reflect the actions of a real end user.

There is no recommendation between using custom events and raw JavaScript. Neither solution is a clear winner.

The best solution here is to write custom steps that expose interactions with these advanced widgets through concise step definitions. See Extending Iridium With Custom Step Definitions.
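
As a purely hypothetical sketch (this step does not exist in Iridium and the wording is only illustrative), a custom step definition could collapse the slider interaction above into a single readable line:

# Hypothetical custom step backed by a custom step definition.
And I set the "slider" to "50" percent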