One question we frequently get is, “How can I use Trust™ and deviceConnect™ to run multiple copies of a test on many devices at the same time?” Sometimes the requirement is to run the same test on many devices concurrently, and sometimes the requirement is to run multiple tests on many devices concurrently.
The video above shows a set of devices running the same test simultaneously using the concurrent mobile test lab that we are going to discuss.
Mobile Labs has a need to stress test our deviceConnect servers – a process that involves looping a fairly extensive test script through multiple iterations on many mobile devices at the same time. We use a server running a hypervisor, concurrent licensing, and HP’s Application Lifecycle Management (ALM) product to data-drive, schedule, and run many tests on many devices at the same time.
We think that the lab we have built successfully fulfills the requirement for a concurrent mobile test lab. It consists of a server running multiple virtual machines connected to a deviceConnect appliance containing mobile devices. With the server in place, HP’s ALM product can drive tests onto virtual copies of the company’s Unified Functional Testing (UFT) software and Trust, each connected to an actual mobile device, using a simple data table.
Building the Mobile Test Lab
Here is how our concurrent mobile test lab is built:
We used a deviceConnect cart with 16 devices installed and a laptop with a browser that could connect to an ALM server running on the VMware server. For the server, we assembled a few off-the-shelf components, as follows:
- 1 LSI MegaRAID 9260-8i PCIE RAID controller and battery backup unit
- 4 Seagate Constellation 2TB 7200-RPM SATA disk drives
- 1 Intel Xeon E5-2640 six-core 2.5 GHz processor
- 1 Supermicro SYS-5027R 2U rack mount Server
- 4 16GB DDR3-1600 ECC registered dynamic RAM
This configuration gave us (with about 10 minutes of assembly time) a six-core, 64GB server with 4TB of RAID-1 protected disk space. The total cost of the server was about $3,900. We connected the server, a laptop, and a deviceConnect cart to a common Ethernet switch, plugged devices into the USB ports on the cart, and our lab was built!
We obtained VMware vSphere Essentials ($560), which consists of the VMware ESXi hypervisor (runs virtual machines on the Supermicro server) and VMware vCenter Server, which provides a Web UI for creating, managing, starting, and stopping virtual machines.
We installed the ESXi hypervisor as the base operating system on the server. Once vCenter server was installed and the Supermicro server was up and running, we began the process of installing the virtual machines that comprise our automation test lab.
To support multiple copies of HP’s UFT 11.5, each with a licensed copy of Mobile Labs’ Trust, we first installed a virtual Windows Server on the Supermicro server. Once it was up and running, we installed the HP License Server, which is able to service concurrent licenses for both Trust and UFT. We then installed a UFT license key that supports 16 concurrent UFT sessions and a Trust license key that likewise supports 16 concurrent sessions. There is no reason that the number of devices should be limited to 16; that is simply the number we worked with. More are certainly possible.
We then began the process of installing 16 Windows desktop systems (Windows 7 Professional). We obtained a multiple activation key (MAK) from Microsoft that allowed us to install multiple copies of Windows 7 Professional on our server using a single activation key. The process of building the systems consisted of installing one copy (not activated) with UFT and Trust, and then using the vCenter server to convert that virtual machine to a VMware template. To build 16 virtual machines, we simply cloned the template 16 times. When we fired up each virtual machine, we entered the Windows license key and activated it. We had previously configured Trust and UFT to get their licensing from the license server.
VMware made it easy to bridge our LAN from the VMs, and vCenter server even has facilities for automatically starting the virtual machines at certain times of the day and in a certain order. With this work complete, we installed an ALM server in a virtual machine on the Supermicro server. We connected to this ALM server with the laptop in order to drive ALM through its Web user interface.
Setting up the Tests
Setting up the tests consisted of configuring ALM to table-drive a set of hosts and a set of tests. Using the ALM project customization tool, we defined the tabular data fields that drive the test(s), adding user fields and specifying data types so the script could pull down the required parameters:
Here is a screen capture showing part of the ALM data-driving table:
Planning out the Test Actions
We planned out the basic test sequence as follows: ALM selected a line of our table and then selected a host from among the VMs (we could assign individual machines or let ALM pick the next available machine from a group). Once a machine was selected, ALM started our test script on the machine and was available for the test script to load its data values. Each script received the following values from the table:
- The URL to contact deviceConnect (same for all scripts)
- The universally unique identifier (UUID) of the mobile device to use for this test
- The name of the app that deviceConnect should install and start on the mobile device
- The username and password of a deviceConnect user authorized to attach the device for testing
Once the script made a device connection, it ran a fairly extensive test (with checkpoints) in a loop so that we could observe performance under maximum load. But note that this technique could be used to run any test on any app on any device.
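The looping structure itself is simple. Here is a hedged sketch of the pattern (the iteration count and the commented checkpoint are illustrative, not our actual script):

```vbscript
' Illustrative sketch only: loop the test body to sustain load on the server.
' The iteration count and checkpoint name below are hypothetical.
Dim iteration
For iteration = 1 To 50
    ' ... drive the app under test through its workflow here ...
    ' A UFT checkpoint verifies the expected state on each pass, e.g.:
    ' MobileDevice("myDevice").MobileApp("myApp").Check CheckPoint("LoginSucceeded")
Next
```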
At the beginning of the script, we pulled the values unique to each run with this code:
dcURL = QCUtil.CurrentTestSetTest.Field("TC_USER_05")
dcDeviceID = QCUtil.CurrentTestSetTest.Field("TC_USER_01")
dcAppID = QCUtil.CurrentTestSetTest.Field("TC_USER_02")
dcUsername = QCUtil.CurrentTestSetTest.Field("TC_USER_03")
dcPassword = QCUtil.CurrentTestSetTest.Field("TC_USER_04")
Each line of code in the above code snippet causes UFT to contact ALM to retrieve the appropriate data for this test run. We used the fields to build a command line to start Mobile Labs’ deviceViewer, connect it to the appropriate device on deviceConnect, install the app, and start the test:
connectParams = "-url airstream://launchApp?" _
& "hubAddress=" & dcURL & "&" _
& "deviceId=" & dcDeviceID & "&" _
& "applicationId=" & dcAppID & "&" _
& "username=" & dcUsername & "&" _
& "password=" & dcPassword
We started deviceViewer, made the connection to the mobile device, and installed and started the necessary mobile app all with this one UFT script line:
"C:\Program Files (x86)\Mobile Labs\Trust"
There is a little additional code not shown here that declares the variables and makes a simple check that any previous deviceViewer instance has been shut down.
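The original post shows only the install path, so here is a hypothetical sketch of what such a launch line might look like, assuming the deviceViewer executable lives under that path and accepts the airstream URL built above (the executable name and use of UFT's SystemUtil object are assumptions):

```vbscript
' Hypothetical completion of the launch step; path and executable name are assumptions.
Dim dvPath
dvPath = "C:\Program Files (x86)\Mobile Labs\Trust\deviceViewer.exe"

' Close any deviceViewer left over from a previous run
' (SystemUtil is UFT's built-in process utility object).
SystemUtil.CloseProcessByName "deviceViewer.exe"

' Start deviceViewer; connectParams tells it which deviceConnect device to
' attach, which app to install and launch, and the credentials to use.
SystemUtil.Run dvPath, connectParams
```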
With the test planned, script written, and code in place in the script to tailor itself to the proper device within deviceConnect, we were ready to run the tests you see in the video shown at the beginning of this post.
Running the Tests
Firing up the tests on as many devices as we wanted was as simple as clicking “run all” from the test list:
We unchecked “run all tests locally” and tried letting ALM choose a VM to host a test from the group “PerformanceVMs” that we created, and we also tried specifying a virtual machine per test. In either case, ALM automatically contacted UFT on each eligible virtual machine and started our test script there. Once the script retrieved the data values from ALM, it was off and running its test on a real mobile device. It took only seconds for ALM to get the full complement of 16 devices up and running at the same time.
ALM gave us a very comprehensive picture of what happened during all of the test runs, including performance data that let us verify that all runs completed within our targeted time limits. The results pane shows the status of each completed test with details from each step. We could see the parameters the script used to connect to deviceConnect, the results of the checkpoints, and the overall pass/fail state of the tests.
Each individual test run could be probed for additional information including start time, end time, verification steps, and overall status.
Conclusion – Well Worth It!
Setting up the concurrent mobile test lab required a little upfront work and minimal initial expense for the server and its software. Once the setup was complete, however, we found that ALM offered us a very powerful and very elegant way to run tests on multiple devices at the same time, meeting most, if not all, of the requirements we hear from our customers and prospects.
By: Michael Ryan