
Net visual bugs easily with AI-driven test automation

AI is helping the company replace traditional visual user interface testing methods that are error-prone and resource-heavy.

Background

The user interface is the face of an application and a key customer touchpoint for any business on the internet – from e-commerce platforms to insurers to stockbrokers. Customer experience hinges on how seamlessly an interface supports functions such as discovery, engagement, and payments.

For a long time, user interfaces have been plagued by visual bugs that evade traditional testing methods. Testers manually comb through every block of the UI to detect bugs, ensuring it adheres to the design and is free of glitches that disrupt customer actions. The process is tedious and error-prone, and it is getting harder as device form factors keep changing. The experience is not consistent across smartphone operating systems either. Dynamic UI elements that move around in response to user activity compound the problem for testing teams.

Gartner research suggests that user experience can influence an application’s success, with 61% of enterprise IT leaders holding that it is a critical metric in application performance monitoring.

How to weed out the inefficiency in UI testing?


10X faster detection of visual bugs

75% reduction in IT overheads for onboarding new markets

60% improvement in test coverage

Client Situation

Our client is a US-based managed communications services provider offering high-capacity bandwidth across several states. It offers broadband, entertainment, and security services to small and midsize customers. The company runs a mobile app for its Fiber-to-the-Home customers, enabling them to manage their accounts, pay bills, upgrade, and reach the support team.

The company had continued with traditional UI testing, leading to less-than-desired customer experience scores. Visual bugs on the user interface disrupted the customer journey at key steps – from discovery to order placement to payment – causing customer dissatisfaction and lost revenue.

AI-driven test automation is helping the company weed out visual UI bugs and offer a consistent user experience on its app.

Diagnosis

In a typical UI design journey, a developer implements the designer’s sketch using JavaScript frameworks and libraries such as React or Angular to build the interface. Manual visual testing is then carried out in parallel with other functional evaluations. The laborious process involves reviewing the interface icon by icon, word by word.

Still, the user experience remains inconsistent across devices, and visual bugs slip through the process. With UI design and development teams adopting agile sprint methodologies, manual testing has become a drag on the company’s aggressive product release timelines.

The company needed a solution to prioritize the right test cases, automate the testing process, achieve higher efficiency, and meet the project timelines set for its app.

Solving It

Prodapt designed a three-pronged solution to help the company spot visual bugs and accelerate timelines. A Natural Language Processing (NLP) based evaluator prioritizes test cases by their degree of importance. The program takes textual inputs describing what each test case covers – for example, the webpage where a customer places an order and proceeds to payment. Using a Machine Learning (ML) algorithm to score importance, the evaluator produces a pecking order of test cases.
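The case study does not disclose which models Prodapt used, but the idea can be illustrated with a minimal sketch: score each test-case description with a text classifier trained on previously labeled cases, then rank by predicted importance. The library choice (scikit-learn), the training examples, and the labels below are assumptions for illustration only, not the production implementation.

```python
# Minimal sketch: rank test cases by predicted importance from their textual
# descriptions, using TF-IDF features and a logistic-regression classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical historical test cases labeled by importance (1 = critical, 0 = routine).
train_descriptions = [
    "customer places order and proceeds to payment page",
    "customer updates billing address in account settings",
    "footer copyright text renders on the about page",
    "login form validates credentials and redirects to dashboard",
]
train_labels = [1, 1, 0, 1]

vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
model = LogisticRegression().fit(vectorizer.fit_transform(train_descriptions), train_labels)

# New test cases to prioritize for the upcoming release.
new_cases = [
    "checkout page shows order summary before payment",
    "help page lists support phone numbers",
]
scores = model.predict_proba(vectorizer.transform(new_cases))[:, 1]

# Produce the "pecking order": highest predicted importance first.
for score, case in sorted(zip(scores, new_cases), reverse=True):
    print(f"{score:.2f}  {case}")
```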

Once the evaluator produces its ranking, a computer vision (CV) based tool analyzes the user interface for visual bugs. Prodapt’s solution offers the flexibility to tune how far the rendered interface may deviate from the original design. Sometimes testers only need to confirm that all the components are in place. Other times, the UI must be an exact replica of the design – a pixel-to-pixel comparison where not an icon or word falls out of line. The CV tool allows for varying degrees of deviation from the design, including cases where dynamic or movable content is part of the UI.
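A configurable comparison of this kind can be sketched with a simple image diff: compare the rendered screenshot against the design baseline, allow a tolerance on the fraction of differing pixels, and mask regions that hold dynamic content. The function name, thresholds, and use of Pillow and NumPy below are assumptions for illustration, not the actual CV tool.

```python
# Minimal sketch: flag a screenshot that deviates from the design baseline
# beyond a configurable tolerance, ignoring regions with dynamic content.
import numpy as np
from PIL import Image

def visual_diff(baseline_path, screenshot_path, tolerance=0.01, ignore_regions=()):
    """Return True if the screenshot deviates from the baseline beyond `tolerance`.

    tolerance      -- allowed fraction of differing pixels (0.0 = pixel-perfect).
    ignore_regions -- iterable of (left, top, right, bottom) boxes to mask out,
                      e.g. banners or carousels that change per user.
    """
    baseline = np.asarray(Image.open(baseline_path).convert("RGB"), dtype=np.int16)
    screenshot = np.asarray(Image.open(screenshot_path).convert("RGB"), dtype=np.int16)
    if baseline.shape != screenshot.shape:
        return True  # mismatched dimensions are always flagged

    # A pixel "differs" if any color channel moves by more than 10 levels.
    diff = np.abs(baseline - screenshot).max(axis=2) > 10

    # Mask out dynamic regions so they never count as defects.
    for left, top, right, bottom in ignore_regions:
        diff[top:bottom, left:right] = False

    return diff.mean() > tolerance

# Pixel-perfect check vs. a lenient check that tolerates 2% drift and ignores a banner:
# strict = visual_diff("design.png", "render.png", tolerance=0.0)
# lenient = visual_diff("design.png", "render.png", tolerance=0.02,
#                       ignore_regions=[(0, 0, 1280, 120)])
```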

Once the tool screens out the bugs, the defects are numbered, labeled, and shared with the developers. An assigned human expert reviews the defects before a report is generated, helping teams catch similar issues more easily the next time.
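The report format is not described in the case study; as a rough illustration, each flagged region could be given a sequential identifier and a label, then packaged for the development team after review. The identifiers, field names, and JSON structure below are hypothetical.

```python
# Illustrative sketch: number and label flagged defects and emit a JSON report.
import json

def build_defect_report(defects, reviewer):
    """defects: list of dicts like {"screen": ..., "region": ..., "label": ...}."""
    report = []
    for number, defect in enumerate(defects, start=1):
        report.append({
            "id": f"VIS-{number:04d}",         # sequential defect number
            "screen": defect["screen"],
            "region": defect["region"],         # (left, top, right, bottom)
            "label": defect["label"],           # e.g. "misaligned icon", "truncated text"
            "reviewed_by": reviewer,
        })
    return json.dumps(report, indent=2)

# print(build_defect_report(
#     [{"screen": "checkout", "region": (40, 310, 220, 360), "label": "misaligned icon"}],
#     reviewer="qa.lead@example.com"))
```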

The AI-driven tool detected visual bugs ten times faster than manual testing, cut the human effort required for visual testing by 75%, and improved test coverage by 60%.
