Mobile app test automation

Our iOS and Android apps are amongst the most popular apps in the Netherlands. The apps were launched in 2011, and in 2012 we had three releases a year per platform. This has now grown to releasing every 2-3 weeks. The way we develop and test the apps has had to grow too: from a two-person team to a cross-functional team covering iOS, Android, API development, testing, design and UX.

Regular Releases

We review the comments made in the App and Play stores and try to include fixes or new features that users have requested in every release. This has led to a continual rise in our app store reviews and ratings. To help increase the speed at which we can release, we have automated as much as we can.

In general, we release to Android first, with a small percentage of our users getting the update. We monitor for crashes or other issues and then increase the percentage of users that get the new app. To become a beta user and get early access to our new releases – go here.


When new app code is pushed to the develop branch, a test run is started in a remote device cloud, where we can choose various screen sizes (phone and tablet), rotations, OS versions, and device makes and models. If all of the regression test runs are green, we build a release candidate which is available for everyone in the teams to download and test with.


We use the open-source test framework Calabash for our automated regression acceptance tests, enabling us to write the same tests for Android and iOS despite the different UI patterns on each platform. The tests are run on a mixture of local devices and the Xamarin device cloud.

Reporting of Tests


We use Jenkins for our CI environment, and all jobs are shown like this on monitors around the office:

– Green background means the last run passed without failures.

– Red background means the last run had failures; the final failure count is shown.

– The dial icon means the tests are in progress.

– Orange background with a dial means the last run failed but there are no failures yet this run.

– Red background with a dial means this run is already failing; the current failed test count is shown.

This is really useful because at a glance we can see how ‘red’ a build is, and we can already look at failures while a test run is ongoing.

By clicking on the box we can look at the failing tests.


We have the test feature name, scenario name, screenshot and error message.

On the right-hand side there is an indication of the status of the last 30 runs.

You can see that this is not a flaky test but is actually failing because of a push 5 runs ago.

How does it work?

In Cucumber, we can use the ‘After’ hook to execute code after every test is completed. We call our internal test reporting API with the details of the scenario: name, status (PASS|FAIL) and, if it failed, a screenshot and stack trace of the error.
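A minimal sketch of how such a hook could be wired up. The endpoint URL, payload fields, and helper names below are assumptions for illustration; the real reporting API will differ:

```ruby
require 'json'
require 'net/http'

# Hypothetical internal reporting endpoint (an assumption, not the real one).
REPORT_URI = URI('http://test-reports.internal/api/results')

# Build the payload sent for each scenario: name, status, and on failure a
# screenshot reference plus the stack trace.
def build_report(name, status, screenshot: nil, trace: nil)
  payload = { scenario: name, status: status }
  if status == 'FAIL'
    payload[:screenshot] = screenshot
    payload[:stack_trace] = trace
  end
  payload
end

# POST the payload as JSON to the reporting service.
def post_report(payload)
  Net::HTTP.post(REPORT_URI, payload.to_json,
                 'Content-Type' => 'application/json')
end

# In Cucumber's support code the hook itself would look roughly like:
#
# After do |scenario|
#   status = scenario.failed? ? 'FAIL' : 'PASS'
#   post_report(build_report(scenario.name, status))
# end
```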

As a future improvement, we would like to know the memory and CPU usage so we could show trends and highlight if we are suddenly using more memory than normal.

Rerunning Failed Tests

Due to the fact that we are running against a test environment also used for backend testing, which can have services restarted or broken at any point, we have added a retry mechanism for failing tests.

In Jenkins, within the same job, the status is checked when the first test run is completed. If there are failed tests, a second run is started for only the failed tests; when this is completed, a third and final run is started if there are still failing tests.
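The retry flow can be sketched like this. The run_tests method below is a toy stand-in (the real runner shells out to Calabash and parses the results); it returns the scenarios that failed:

```ruby
# Toy stand-in for the real test runner: each 'scenario' here is a callable
# that returns true when it passes. The real version invokes Calabash.
def run_tests(scenarios)
  scenarios.reject { |scenario| scenario.call }
end

# Run the full suite, then rerun only the failures, up to three attempts in
# total, mirroring the Jenkins job described above.
def run_with_retries(scenarios, max_attempts: 3)
  attempt = 1
  failing = run_tests(scenarios)
  while failing.any? && attempt < max_attempts
    attempt += 1
    failing = run_tests(failing)   # rerun only the failed scenarios
  end
  failing                          # whatever is still red after the last attempt
end
```

A scenario that only fails on its first attempt (for example because a backend service was restarting) ends up green, while a consistently broken one is still reported as failed.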

Example from our Jenkins console log:

2015-03-17 18:49:56 +0100 Status: Testing: 1 running, 0 enqueued, 0 complete…

2015-03-17 18:50:06 +0100 Status: Finished Done!

Total scenarios: 21

20 passed

1 failed

Total steps: 63

Test Report:

Should retry

2015-03-17 18:50:33 +0100 Status: Testing: 0 running, 1 enqueued, 0 complete…

2015-03-17 18:54:42 +0100 Status: Finished Done!

Total scenarios: 1

1 passed

0 failed

Total steps: 4

Test Report:

Should not retry

How does it work?

Cucumber allows us to use the ‘Around’ hook to determine whether the test scenario should run or not.

Around do |scenario, block|
  block.call if should_run_scenario?(scenario)
end

We call our reporting API and get a JSON response listing the scenarios in the previous attempts within the same Jenkins run and whether they passed or failed.

should_run_scenario? returns true if it's the first attempt, or if the test failed in the previous attempt within the same Jenkins run.
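A sketch of that check. The JSON shape assumed here (scenario name mapped to status, nil on the first attempt) and the method's parameters are assumptions for illustration; the real helper takes the Cucumber scenario object and queries the reporting API:

```ruby
require 'json'

# previous_json is the reporting API's response for the previous attempt in
# the same Jenkins run, e.g. '{"Should retry":"FAIL","Should not retry":"PASS"}',
# or nil on the first attempt. (This payload shape is an assumption.)
def should_run_scenario?(scenario_name, previous_json)
  return true if previous_json.nil?               # first attempt: run everything
  results = JSON.parse(previous_json)
  results.fetch(scenario_name, 'FAIL') == 'FAIL'  # rerun only previous failures
end
```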


Mobile automation is still quite early in its development, but we have seen vast improvements in the stability and reliability of the tests we are running. It's not at the same level of sophistication as browser testing with Selenium, but it's getting better, and the ability to run on real devices rather than simulators or emulators has increased our stability and coverage.


Cucumber –

Calabash –

Xamarin –

iOS Accessibility

Our iOS and Android apps are the most popular Dutch apps in the Apple and Play stores. Millions of Dutch people use our app every day, including those with visual impairments.

We thought that we had an accessible app because we used accessibility labels for our automated testing. Everything had a label, so we thought that was good enough.

However, one of the unintended side effects of our iOS 3.0 app release was that we had unknowingly broken a lot of the functionality of our app when VoiceOver was on.

A visually impaired user contacted our customer services to complain that the app had become unusable and we asked him to help us. We visited him and watched how he used the app and where he got stuck.

Here is what we have learned.

Getting Started

Switch on the VoiceOver shortcut.

Go to Settings > General > Accessibility > Accessibility Shortcut > VoiceOver.

This enables you to switch VoiceOver on or off at any time by triple-clicking the home button.


Learn the gestures.

The basic navigation is a swipe from left to right anywhere on the screen. When a page is loaded, the first element is highlighted and its text is read out; if it's an icon, there should be an accessibilityLabel that is read out instead.

The swipe is used to move on to the next element to be read out, and if the user wants to ‘touch’ that item, they double-tap with one finger anywhere on the screen.

Copy the native apps 

For us, the Mail app was the closest to our own, so we based how our app should behave on comparing it to the Mail app.



Each ‘result’ in the list should be read out as one VoiceOver text. The different elements making up the result should be read out in order of importance: the user wants to be able to work out as quickly as possible whether to keep listening or move on to the next result.

Our app was reading each individual text element, so if the user wanted to skip that result they had to swipe 4 or 5 times to get to the next one. The user should be able to drag a finger down the list with each click being one result, so they can skip the first 3 results in the list by dragging from top to bottom for 3 clicks.


Next Improvements

We hadn't realised that it was also important to know where you are when you return to the list. In our app, a result opens an advertisement for an item for sale. When the user goes back to the list (the back gesture is a two-fingered backwards ‘Z’ that starts at the bottom and goes to the top of the screen), the same result should still be selected and VoiceOver should read out its contents again. This confirms to the user that they are back at the same point and can continue scanning the list.

Accessibility labels for icons

Our screen for a single advertisement has an action bar at the bottom.


To our eternal shame – even the Contact and Place Bid (Plaats Bod) buttons, the two most important functions on the screen, were not available to visually impaired users.  All the actions on the bottom bar were obscured by one label that was read out as “VIP floating bar”.

We saw that the user had to switch off VoiceOver, select Contact, then switch VoiceOver back on again just to be able to contact the seller about the item for sale.

We fixed this so that the 4 actions are read out as individual elements, Contact, Place bid, Add ad to my favorites, and Share.



We had written our own custom image picker. This gave us the ability to select more than one image from the library at a time, a handy feature for most of our users, but we entirely lost all the accessibility features of the native image library, features we were not even aware of.

It was not possible to select a photo at all when VoiceOver was on. The user had to switch it off, pick an image, and switch it on again. There is metadata for an image which should be read out, containing the time it was taken and some idea of the sharpness and lightness of the image. It's also possible to add a label to a photo from the native image library; our app should allow that label to be read out too.

Our solution is to detect if VoiceOver is on and use the native image library instead of our own.  This way we get all the native accessibility features for free.

Adding a label to an image


Go to the native Photos app, turn on VoiceOver, and navigate until an image is highlighted. This will read out the metadata. To add a label you need to tap twice with two fingers, holding down on the second tap until you hear three pings. A popup should appear where you can add text. This one takes practice: two taps on their own just start your music playing.


Next Improvements

Our user had a simple request for this screen: when a photo has been added, read out its label and metadata; for the empty tiles, say ‘add photo’. We had everything saying tile 1, tile 2, tile 3, no matter whether an image had been added or not.


Try it for yourself, or even better, watch a real user using your app with VoiceOver. For visually impaired people, the technology in their iPhone makes life easier. Make sure your app does too.


Further reading

This is a great blog post by Matt Gemmell about accessibility in iOS apps.