Thursday, November 8, 2018
With all the excitement and buzz around continuous test automation for mobile app and mobile web testing, it’s only natural to be curious. What does continuous testing mean for your developers and testers? What can your team accomplish with more speed and automation in your corner?
For enterprise mobility teams that rely primarily on manual testing, however, the notion of moving testing from “your hand to the cloud” can be a little daunting. But there is no reason to be nervous if you plan carefully and set yourself up for success right from the beginning. A strong start yields a smooth, precise continuous testing strategy that improves performance, app quality, DevOps practices, and overall agility.
Through my work with our customers and from helping their enterprise mobility teams on the path to continuous testing, I’ve identified five common roadblocks that mobile developers and testers often stumble over when starting out. Here’s what you can do to avoid these bumps in the road and to begin your continuous testing journey on the right foot.
When getting started with continuous testing, don’t staff your team with testers who are geared for manual testing only. Because continuous testing relies on test automation, it is important to have testers on your team who are experienced with automation principles. Manual testers are invaluable for making sure that an app is easy to use, meets business objectives, and can earn 5-star ratings, but test automation requires a different set of skills.
Instead, dedicate a team in your mobile testing lab with automation experience in its test automation tool of choice. Some tools, like Appium®, may require more coding experience than others, such as Tricentis Tosca® or Micro Focus UFT. The team will need to build out a framework and a continuous testing pipeline, then work together to establish a repeatable set of tests.
I come across a lot of teams doing mobile test automation, and the most successful teams are integrated with, or work closely alongside, the development team. Why are they more successful? I believe it comes down to better communication. Compare the experience of being integrated with development to going it alone, and I think you’ll discover that you get more out of collaboration than out of being the Lone Ranger.
Working closely together means that the teams tend to solve issues faster and help each other maintain high quality apps. When working together the goals are aligned and when building out a continuous testing strategy, all the stakeholders are invested in the outcome.
How does this collaboration play out in continuous testing? Well, the DevOps team will help get the automated tests to run with the application build jobs. If tests fail, then development and QA teams are both notified and issues can be solved faster.
Automated tests can also be created more easily with development buy-in. Often the development team does not know the best way to make applications accessible for automation. Without buy-in, testing teams can struggle to write automated tests, or spend far more effort than needed, because developers have not made the application accessible for automation (setting the content-desc and resource-id attributes on Android, and the name and accessibility-id attributes on iOS).
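To make the payoff concrete, here is a small sketch of what those attributes buy you in test code. This is an illustrative example, not from the post: the element names, the Android resource-id, and the helper function are all hypothetical, and the locator strategy strings are the standard ones Appium clients accept.

```python
# Hypothetical helper: when developers set stable accessibility attributes,
# a locator is one short, readable pair instead of a brittle XPath.
def login_button_locator(platform: str) -> tuple[str, str]:
    """Return an (Appium locator strategy, value) pair for the login button."""
    if platform == "android":
        # "id" matches the resource-id the developer assigned in the layout.
        return ("id", "com.example.app:id/login_button")
    if platform == "ios":
        # "accessibility id" matches the view's accessibilityIdentifier.
        return ("accessibility id", "login_button")
    raise ValueError(f"unsupported platform: {platform}")

# Without developer buy-in, the fallback is often a fragile
# position-based XPath that breaks whenever the layout shifts:
FRAGILE_XPATH = "//android.widget.FrameLayout/android.widget.Button[2]"
```

The one-line locators survive UI redesigns; the XPath fallback rarely does, which is where the “larger efforts than needed” come from.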
Don’t get me wrong: many teams just need to produce testing results and are well suited to going it alone. But teams that go rogue face a much higher hurdle without development buy-in to the testing efforts.
While it may be tempting to use your team’s advanced skills to build your own testing framework, that strategy can backfire, particularly when it comes to support, where issues may be hard to diagnose and fix. It is often necessary, and even comforting, to have a trusted team with a knowledge base you can tap into to help push toward a successful continuous testing strategy.
The thing I love about Mobile Labs’ support team is that they answer our customers’ automated test scripting questions, including problems that are genuinely hard to diagnose, such as when a team uses a custom-built framework to manage data, test flows, reporting, and parallelization. Diagnosing those issues can be a time-consuming process, and reaching a resolution is difficult when you’re going rogue. It is often easier for teams to get up and running with continuous testing by exploring established tools and test frameworks that have broad community support across many programming languages.
Most of these frameworks are free and open source. Leveraging these tools makes it easier for the following reasons:
An established framework can be implemented quickly, and automation tests can be written immediately. There is no time wasted building out a complex testing framework that mimics the same functionality as one already available.
When a problem or a knowledge gap arises, it is easy to find an example online for an established framework. A custom-built, in-house framework is usually undocumented, so automation users have to rely on trial and error to work within it. Should the framework’s subject matter expert leave the company, automation efforts could stall whenever the testing project changes or new automation work is needed.
Most open-source frameworks are adaptable, so changing something that doesn’t work in your environment can be done with little effort. For instance, if you need to have some custom functionality to connect to a device for a test, you can extend the test functions to automatically handle this and keep the framework easy to use.
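As a sketch of what “extend the test functions” can look like in practice, here is one common pattern: wrap the custom device-connection step in a context manager so test authors never touch setup or teardown directly. The `reserve_device` and `release_device` helpers are hypothetical stand-ins for whatever your lab’s API actually provides.

```python
from contextlib import contextmanager

def reserve_device(pool):
    return pool.pop()          # placeholder: claim a free device from the lab

def release_device(pool, device):
    pool.append(device)        # placeholder: return the device to the lab

@contextmanager
def device_session(pool):
    """Wrap a test in custom connect/disconnect logic in one reusable place."""
    device = reserve_device(pool)
    try:
        yield device
    finally:
        release_device(pool, device)   # runs even if the test fails

# A test then reads naturally, with no connection boilerplate:
pool = ["pixel-6", "iphone-13"]
with device_session(pool) as device:
    result = f"ran on {device}"
```

Because the custom behavior lives in one place, swapping in a different connection mechanism later means changing the context manager, not every test.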
Implementing parallel testing is hard but worth the effort. In fact, you can see this for yourself with our handy calculator.
For example, parallel testing with Appium requires knowledge of spinning up new threads for execution, which is a complex piece of development. Some threads can start up before others finish, while Appium resources and devices are still in use. Either a test fails to run, or a new test causes an existing test to fail with no evidence as to why, which is difficult to diagnose and debug.
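One common way to avoid that device contention (an assumed pattern, not something the post prescribes) is to hand each worker thread a device from a thread-safe queue, so a newly started test can never grab a device that another thread is still using:

```python
from concurrent.futures import ThreadPoolExecutor
from queue import Queue

def run_test(test_name: str, devices: Queue) -> str:
    device = devices.get()      # blocks until a device is actually free
    try:
        # placeholder for the real Appium session against `device`
        return f"{test_name} passed on {device}"
    finally:
        devices.put(device)     # release only after this test fully finishes

# Two devices shared safely among five tests:
devices = Queue()
for d in ("pixel-6", "galaxy-s21"):
    devices.put(d)

tests = [f"test_{i}" for i in range(5)]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(lambda t: run_test(t, devices), tests))
```

Because `Queue.get()` blocks, a thread that starts early simply waits for a free device instead of colliding with a session already in progress, which removes the “test fails with no evidence why” failure mode.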
Given the above advice, I will offer one word of caution: don’t go overboard. I have even come across a test framework that used an Excel-driven table to drive a TestNG framework that called JUnit tests.
On the subject of testing on simulators versus real devices, I am an advocate for testing on real devices for several reasons. I recently wrote a blog post on this topic, which you can view here for a more in-depth study.
But to briefly summarize, despite having built a powerful Simulator (Apple) and an Emulator (Android), both vendors still advocate for testing on a real device. Clearly, while simulators and emulators can help developers and testers quickly test certain elements of mobile apps and mobile websites before deployment, they should not be considered a replacement for testing on a real device.
For real results and to get the most accurate insight into how an app or mobile website will function in the real world, enterprise mobility teams should test on real devices.
Consider this excerpt from Apple’s Simulator Help Overview (emphasis added):
“Simulator is a great tool for rapid prototyping and development of your app allowing you to see the results of changes quickly, debug errors, and run tests. It is also important to test your app on physical devices as there are hardware and API differences between a simulated device and a physical one. In addition to those differences, Simulator is an app running on a Mac and has access to the computer’s resources, including the CPU, memory, and network connection. These resources are likely to be very different in capacity and speed than those found on a mobile device requiring tests of performance, memory usage, and networking speed to be run on physical devices.”
Next, consider this quote from the Android Studio User Guide (emphasis added):
“When building an Android app, it's important that you always test your app on a real device before releasing it to users.”
Building and running a functioning mobile testing lab on your own can be challenging. Between keeping up with and managing devices, making sure that developers and testers have the tools and resources they need to deliver on time, and handling the volume of tests that need to run, there are a lot of moving pieces and parts.
Also, if you’re working with Appium, building and running your own Appium servers is time-consuming and expensive to set up, and slow performance and slow scripting can cause your enterprise mobility team additional stress.
It is also important to remember that when you build your own mobile testing lab, devices are typically dedicated to a single function like test automation, while manual testers and developers must still keep separate devices in hand to manage.
By choosing a device cloud solution instead of building your own mobile testing lab, your team sidesteps the tough challenges like Appium server setup. The team can also use the same cloud devices for both manual and automated testing, with devices always online and available. With so many existing solutions, why reinvent the wheel when you don’t have to?
Just don’t do it. Seriously.
By avoiding these five mistakes, your mobile development and testing teams can rest assured that embarking on the continuous testing journey can be as painless as possible. With the right tools, support and methodology in place you safeguard your processes from unnecessary conflict and difficulties that could prevent your team from doing what they do best – providing superior mobile experiences.
Want to learn more about building a successful continuous delivery pipeline? You can download our latest eBook on the subject here.