Throw a group of developers, software engineers, testers, and designers together for a while and watch them come up with innovative ideas. TestAutothons work the same way: brainstorming ways to hack through a given assignment is a common sight at any such event. Here is our experience of how the no-code approach of our intelligent test automation tool, JiffyTEST, helped us solve the problem scenario presented to us.
For the Step-In Challenge at TestAutothon 2018, Option3 was given the task of completing the test flow described below.
What We were Expected to Achieve
Our task was to automate a testing scenario that ensured data matched across two websites. However, the scenario was not as simple as it appeared. This is what was expected of us:

We had to Google-search 20 movies, extract the Wikipedia and IMDB page links for each movie, extract the director's name from both the Wikipedia page and the IMDB link, and finally validate that the director's name matched on both websites. To add to the complexity, this had to be achieved through either a GUI call or an API call, and we were expected to generate a custom HTML report with the details. We also had to provide a snapshot of the task we created and of the final report.
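The core check per movie can be sketched as follows. This is a minimal, hypothetical Python sketch of the logic, not the JiffyTEST implementation; the two `extract_*` helpers are stand-ins for the real page-scraping steps and here simply read canned sample data so the sketch stays self-contained.

```python
# Canned sample data standing in for live Wikipedia/IMDB pages.
SAMPLE_PAGES = {
    "wikipedia": {"Inception": "Christopher Nolan"},
    "imdb": {"Inception": "Christopher Nolan"},
}

def extract_director_from_wikipedia(movie):
    # Real flow: Google-search the movie, open its Wikipedia page, scrape the director.
    return SAMPLE_PAGES["wikipedia"][movie]

def extract_director_from_imdb(movie):
    # Real flow: open the IMDB page via a GUI or API call, scrape the director.
    return SAMPLE_PAGES["imdb"][movie]

def directors_match(movie):
    """Validate that both sites report the same director for the movie."""
    wiki = extract_director_from_wikipedia(movie).strip().lower()
    imdb = extract_director_from_imdb(movie).strip().lower()
    return wiki == imdb

print(directors_match("Inception"))  # True for the sample data above
```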
Challenges that We had to Go Through
We had quite an exciting and challenging time trying to meet these expectations. We were required not just to read the movie names but also to match each one with the number given in the list. Also, steps 1 and 2 (the Google search for all 20 movies) had to run simultaneously as a single step before the test commenced, populating an in-memory data structure.
The requirement was to implement one test (one script or method) that took the movie name and related it to the Wikipedia URL as data. A significant challenge was that this test had to run at both the GUI and the HTTP layer, with the test code remaining the same in both cases. On top of that, at least one GUI-layer test was expected to run on a mobile device.
The Method Used to Achieve This
The first step was to input the movie names. We used the CSV node in JiffyTEST to iterate over the 20 input records, each containing a movie ID, a movie name, and a mode of automation (HTTP, GUI, or Mobile). Next, for the Google search and Wikipedia-link extraction, we used the Web UI node of JiffyTEST so that the wiki link would be captured correctly, regardless of the order of the Google search results.
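The shape of that input can be illustrated with a small sketch. The column names below are illustrative, not JiffyTEST's actual schema; the CSV node's iteration is approximated here with Python's standard `csv` module.

```python
import csv
import io

# Illustrative input, mirroring the three fields each record carried:
# movie ID, movie name, and mode of automation.
CSV_TEXT = """MovieId,MovieName,Mode
1,Inception,HTTP
2,Avatar,GUI
3,Titanic,Mobile
"""

# Each row becomes a dict, one iteration per record.
records = list(csv.DictReader(io.StringIO(CSV_TEXT)))
for record in records:
    print(record["MovieId"], record["MovieName"], record["Mode"])
```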
Then, to get the director's name and the IMDB URL for each movie, the idea was to apply the Jiffy Web UI node, REST API node, or Mobile UI node, depending on the mode of automation in the CSV record. Any branching in the automation flow could be handled using the Jiffy IF node.
We then repeated the process, based on the mode in the CSV record, to extract the director's name from the IMDB URL. Finally, using the Jiffy Validator node or the IF node, we validated that the director's name captured from the IMDB page matched the one captured from the Wikipedia page.
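The IF-node style routing can be sketched as a mode-to-handler lookup. This is a hypothetical Python analogue, not JiffyTEST code; the three fetchers are placeholders for what the Web UI, REST API, and Mobile UI nodes would actually do, and they return canned data here.

```python
def fetch_director_http(movie):
    return "Christopher Nolan"  # placeholder for a REST API call

def fetch_director_gui(movie):
    return "Christopher Nolan"  # placeholder for browser automation

def fetch_director_mobile(movie):
    return "Christopher Nolan"  # placeholder for mobile-device automation

# One handler per mode of automation, mirroring the IF-node branching.
HANDLERS = {
    "HTTP": fetch_director_http,
    "GUI": fetch_director_gui,
    "Mobile": fetch_director_mobile,
}

def director_for(record):
    # Route the lookup based on the record's mode of automation.
    return HANDLERS[record["Mode"]](record["MovieName"])

record = {"MovieId": "1", "MovieName": "Inception", "Mode": "HTTP"}
print(director_for(record))
```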
From the beginning, we made sure that the GUI portion was handled by Engineer 1, while the API calls were managed by SME 1. Engineer 2 handled the multiple threads for the input file, and SME 2 was responsible for report generation across the entire solution as well as overall coordination. Finally, the lead engineer took care of integration and team management.
How did we Use the Features of JiffyTEST in these Testing Scenarios?
JiffyTEST was highly beneficial for the given testing scenarios because the requirement called for parallel execution of iterations across all available active services. This is supported out of the box: the JSM module of JiffyTEST, one of the features we showcased, runs iterations in parallel, and the JSM UI lets you configure how the threads scale.
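Conceptually, the parallel iterations behave like the thread-pool sketch below. This is an assumption-laden Python analogue of the JSM behavior, not its implementation; `run_iteration` is a placeholder for one full movie check.

```python
from concurrent.futures import ThreadPoolExecutor

def run_iteration(record):
    # Placeholder for one end-to-end movie check (search, extract, validate).
    return {"MovieId": record["MovieId"], "Status": "PASS"}

# 20 input records, one per movie, as in the challenge.
records = [{"MovieId": i, "MovieName": f"Movie {i}"} for i in range(1, 21)]

# max_workers plays the role of the thread-scaling knob in the JSM UI.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(run_iteration, records))

print(len(results))  # all 20 iterations completed
```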
Another feature we showcased was the JiffyTEST JDI UI portal, which enabled us to design a report capturing the mode of automation and the final execution status of each iteration for the user's reference. We also used the screen-capture feature of JiffyTEST to take screenshots from the Wikipedia and IMDB pages during GUI runs and map these results into the report.
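The kind of custom HTML report the challenge asked for, one row per iteration with its mode and status, can be sketched as follows. The field names and layout are illustrative, not the JDI portal's actual schema.

```python
# Illustrative per-iteration results (not real run data).
results = [
    {"MovieId": 1, "MovieName": "Inception", "Mode": "HTTP", "Status": "PASS"},
    {"MovieId": 2, "MovieName": "Avatar", "Mode": "GUI", "Status": "PASS"},
]

# One table row per iteration.
rows = "".join(
    "<tr><td>{MovieId}</td><td>{MovieName}</td>"
    "<td>{Mode}</td><td>{Status}</td></tr>".format(**r)
    for r in results
)
report = (
    "<html><body><table>"
    "<tr><th>Id</th><th>Movie</th><th>Mode</th><th>Status</th></tr>"
    + rows + "</table></body></html>"
)
print(report)
```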
Key Features of JiffyTEST that we Highlighted
- UI Automation – an inbuilt automation technique for UI or HTTP
- Parallel Execution – parallel iterations and multiple threads are possible
- Zero-Scripting – no need to write any code for any of the testing scenarios
Results we Delivered
We met the requirements within the given time, as expected, using a single test case. All the iterations were handled with JiffyTEST's built-in features. The main highlight we were able to showcase was that the tool works for the end user without the need to write any expressions.