Categories
Operational Excellence

Improving the efficiency of your Field Service Workforce

Leverage machine learning to eliminate blind dispatches and improve the first-time fix rate (FTFR)

Field technicians are the face of your service organization, and it is imperative to equip them with the right tools and knowledge to handle any field challenge. By managing and empowering technicians effectively, your organization can deliver fast, effective, and efficient services to customers.

A business should strike a balance between the speed and accuracy of handling on-site customer requests to increase technician productivity and improve customer satisfaction. In reality, however, technicians are frequently unable to resolve customer problems on time and are forced to make multiple trips to the client location due to process inefficiencies. Thus, instead of servicing new customers or strengthening current customer relationships, technicians spend valuable time and resources on non-revenue-generating activities.

Today, 70% of field technicians visit sites without prior information about the nature of the problem, the issue location, or a recommended solution. This leads to repeated dispatches, longer resolution times, and high customer churn.

Going digital is the cornerstone of success for a modern services organization. Adopt the ‘AI-Powered Field Service Framework’ to optimize field services and increase technician productivity. The framework encompasses three vital components to achieve a higher First Time Fix Rate (FTFR) and reduce Mean Time to Resolve (MTTR):

  • Fault Location Classifier: Predicts the fault location and sends email/SMS notifications to technicians via a mobile app
  • Recommendation Engine: Suggests guided actions and next-best resolution steps to improve technicians’ efficiency
  • Technician Dashboard: Provides a one-stop view of all dispatches and actionable insights for technicians
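To make the first component concrete, here is a minimal sketch of a fault-location classifier and the notification it would trigger. The keyword lists, location labels, and function names are hypothetical; a production system would use a trained ML model over ticket and network telemetry features rather than keyword scoring.

```python
# Minimal sketch of a fault-location classifier (illustrative only).
# Keyword evidence per location is hypothetical; a real system would
# train a model on historical tickets and telemetry.
from collections import Counter

# Hypothetical keyword evidence per candidate fault location.
LOCATION_KEYWORDS = {
    "customer_premises": {"router", "wifi", "modem", "indoor"},
    "access_network":    {"line", "dslam", "fiber", "splitter"},
    "core_network":      {"bgp", "core", "gateway", "backbone"},
}

def predict_fault_location(ticket_text: str) -> str:
    """Score each candidate location by keyword overlap and return the best."""
    tokens = Counter(ticket_text.lower().split())
    scores = {
        loc: sum(tokens[w] for w in words)
        for loc, words in LOCATION_KEYWORDS.items()
    }
    return max(scores, key=scores.get)

def notify_technician(ticket_text: str) -> str:
    """Compose the SMS/email payload pushed to the technician's mobile app."""
    location = predict_fault_location(ticket_text)
    return f"Dispatch: likely fault at {location} -- '{ticket_text}'"
```

Even this crude scoring shows the payoff: the technician arrives knowing which segment of the network to inspect first, instead of diagnosing on site.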


70% of field technicians visit sites without prior information about the problem, leading to repeated dispatches, longer resolution times, and high customer churn.

Categories
Digital Customer Experience Insights

Using AI to understand how your customers feel

Predict Net Promoter Scores and identify whether a customer is likely to be a promoter, neutral, or detractor. Take timely corrective action to improve customer service.

Customers today expect a seamless and hassle-free interaction with their service providers. A dissatisfied and frustrated customer will quickly switch providers. Thus, it is crucial for the service provider to understand the customer experience and promptly take corrective measures when it lags. One key metric for this is the Net Promoter Score (NPS), which measures customer loyalty and satisfaction by asking customers how likely they are to recommend your product or service to others on a scale of 0-10.

To capture NPS, service providers share survey forms with their customers. But do customers respond to such surveys? Research shows that only 15-20% of customers respond to the NPS survey after their interactions with customer support. Does this mean the service provider should take no action for the remaining 80-85%, assuming they had a good experience? There is a high possibility that a customer dissatisfied with the service has already decided to leave without making the effort to respond to the survey.

The most innovative service providers are addressing this problem with a machine learning (ML) approach.
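As a rough illustration of the idea, the sketch below predicts a 0-10 score for a non-responding customer from support-interaction features and maps it to the standard NPS bands. The feature names, weights, and bias are hypothetical stand-ins for what a trained regression model would produce.

```python
# Sketch of NPS prediction for survey non-responders (illustrative).
# WEIGHTS and BIAS are hypothetical; a real model would learn them
# from customers who did respond to the survey.
WEIGHTS = {"resolution_time_hrs": -0.3, "repeat_contacts": -1.2, "csat_last": 0.8}
BIAS = 7.0

def predict_nps(features: dict) -> float:
    """Predict a 0-10 NPS score from support-interaction features."""
    score = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return max(0.0, min(10.0, score))  # clamp to the NPS scale

def nps_segment(score: float) -> str:
    """Map a 0-10 score to the standard NPS bands."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "neutral"   # often called "passive" in NPS literature
    return "detractor"
```

A predicted detractor can then be routed for proactive outreach, closing the loop for the 80-85% of customers who never answer the survey.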

Fig: Key steps towards building ML Model for CSAT Prediction and Improvement


NPS provides customer loyalty and satisfaction measurement by asking customers how likely they are to recommend your product or service to others

Categories
Operational Excellence

Turn your network issues into customer delight

Leverage automation strategies to streamline the Trouble to Resolve (T2R) process, providing customers with quick resolution and greater satisfaction

TM Forum, a global industry association for service providers and their suppliers in the telecommunications industry, defines the Trouble to Resolve (T2R) process as part of its business process framework, eTOM (enhanced Telecom Operations Map). The process describes how to handle a trouble (problem) reported by the customer: analyze it to identify the root cause, initiate a resolution that meets customer satisfaction, monitor progress, and close the trouble ticket.

Most service providers follow the eTOM T2R process; however, they encounter key challenges that affect overall T2R operational efficiency and increase OPEX.

  • Multiple siloed systems to complete a network event’s lifecycle lead to high manual effort and increased OPEX
  • Difficulty in identifying the right impact of a network event:
    • No proper tools for auto-identification and prioritization of critical events that would cause major business impact
    • Resource wastage: the Network Operations Centre (NOC) tends to spend a significant amount of time handling huge volumes of alerts
  • Difficulty in meeting business KPIs due to the unavailability of fully integrated systems and automated processes

Service providers in the connectedness industry must develop an effective strategy for integrating these systems and bringing end-to-end automation to the T2R process flow. The majority of service providers have a basic level of automation; however, there is huge scope for complete lifecycle automation. This Insight showcases an effective approach for implementing end-to-end automation of the network event lifecycle, from event creation to resolution. The approach is based on implementation experience with leading service providers across multiple geographies.
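One small piece of such automation, the auto-identification and prioritization of critical events mentioned above, can be sketched as follows. The severity labels, thresholds, and rules are hypothetical; real deployments derive them from service impact and SLA data.

```python
# Sketch of auto-prioritization in a T2R pipeline (illustrative).
# Rules and thresholds are hypothetical examples.
from dataclasses import dataclass

@dataclass
class NetworkEvent:
    severity: str            # "critical" | "major" | "minor"
    customers_affected: int
    sla_breach_risk: bool

def priority(event: NetworkEvent) -> int:
    """Return 1 (highest) .. 4 (lowest) for NOC queue ordering."""
    if event.severity == "critical" or event.sla_breach_risk:
        return 1
    if event.customers_affected > 1000:
        return 2
    if event.severity == "major":
        return 3
    return 4

def auto_triage(events: list) -> list:
    """Order the alert queue so high-impact events are handled first."""
    return sorted(events, key=priority)
```

Automating this triage step alone removes a large share of the manual alert-sifting effort the NOC currently absorbs.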

“According to a report by McKinsey, many service providers have complex fundamental processes with multiple system integrations and are labor-intensive and costly. Leveraging digital technologies to simplify and automate operations makes them more productive and results in a significant cost reduction of up to 33%.”

Categories
Operational Excellence

Steering data migration, powered by RPA

Leverage an RPA-based automation framework to accelerate data migration and improve accuracy

Data migration involves moving data between locations, formats, and applications. The need is on the rise due to ongoing trends such as mergers and acquisitions (M&As), migration of applications to the cloud, and modernization of legacy applications. However, traditional data migration methods have not kept pace with this increasing frequency.

According to Gartner, 50% of data migration initiatives will exceed their budget and timeline by 2022 because of flawed strategy and execution. Most service providers in the connectedness industry adopt the traditional approach to data migration, which involves three broad steps: migration planning and preparation, establishing governance, and execution.

Service providers follow the fundamental extract, transform, load (ETL) methodology for data migration execution, which poses several challenges. It entails high cost and long timelines due to mock runs and testing for each module. It also involves manual effort, which leads to considerable rework due to errors and causes fallouts due to data integrity issues. In addition, ramping teams up and down is difficult.

To overcome these challenges, an RPA-based automation framework for data migration execution can be an effective approach. The framework encompasses components such as:

  • Smart processor: Identifies data quality and integrity issues in the source data at a very early stage
  • Automation bot: Performs the migration/upgrade by extracting and updating data at various layers of the application
  • Fallout management mechanism: Automates fallout handling, i.e., fixes data quality and integrity issues in CRM, inventory systems, etc.

“According to Gartner, 50% of data migration initiatives will exceed their budget and timeline by 2022 because of flawed strategy and execution.”

Categories
Software Intensive Networks

Redefining Virtual Network Function (VNF) Testing

Creating an effective and portable VNF testing framework with end-to-end automation

Service providers in the connectedness industry are rapidly introducing new services online to enhance customer satisfaction and provide better service. However, they face several testing challenges in terms of scope, complexity, and frequency during the delivery process. Almost 40% of a service provider’s time and effort is consumed by testing activities. Virtual Network Function (VNF) testing in the connectedness industry has become complex, costly, and time-consuming. To address the testing challenges service providers face in virtualized environments, existing testing methods and processes must be revised and shifted to a new testing model.

This Insight encompasses the key elements that can help service providers create an effective and portable VNF testing framework with end-to-end automation. The key benefits include rapid test development and maintenance, faster product launches, improved product quality, and differentiated product delivery.


Accelerate service rollout time by 78% with a redesigned VNF testing framework.

Categories
Cloud Insights

Prevent your data lake from turning into a data swamp

Build a lightweight, efficient data lake on the cloud

The future of Service Providers will be driven by agile and data-driven decision-making. Service Providers in the connectedness industry generate data from various sources every day. Hence, integrating and storing their massive, heterogeneous, and siloed volumes of data in centralized storage is a key imperative.

Every service provider needs a high-quality data storage and analytics solution that offers more flexibility and agility than traditional systems. A serverless data lake is a popular way of storing and analyzing data in a single repository. It offers vast storage, autonomous maintenance, and architectural flexibility for diverse kinds of data.

Storing data of all types and varieties in central storage may be convenient, but it can create additional issues. According to Gartner, “80% of data lakes do not include effective metadata management capabilities, which makes them inefficient.” Service providers’ data lakes are not living up to expectations for reasons such as the data lake turning into a data swamp, lack of business impact, and complexities in data pipeline replication.
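The metadata-management gap Gartner describes can be narrowed by registering every dataset in a catalog at ingestion time. The sketch below is illustrative only; the catalog structure and field names are hypothetical, and production lakes would use a dedicated catalog service rather than an in-memory dict.

```python
# Sketch of minimal metadata registration at ingestion (illustrative).
# Catalog structure and fields are hypothetical; the point is that every
# dataset landing in the lake carries ownership, schema, and lineage.
from datetime import date

catalog: dict = {}

def register_dataset(name: str, owner: str, schema: list, source: str) -> None:
    """Record catalog metadata so the dataset stays discoverable."""
    catalog[name] = {
        "owner": owner,
        "schema": schema,
        "source": source,
        "ingested": date.today().isoformat(),
    }

def undocumented(all_datasets: list) -> list:
    """Datasets present in storage but missing from the catalog: swamp risk."""
    return [d for d in all_datasets if d not in catalog]
```

Running the `undocumented` check regularly is one simple way to spot the lake drifting toward a swamp before the data becomes unfindable.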

Categories
Product Engineering

Move to technology-driven smart policing

Leverage predictive analytics to reduce crimes and burglaries by 30%

Today, crime rates in most parts of the world remain high despite the necessary measures being taken. FBI reports reveal a “3.9% increase in the estimated number of violent crimes and a 2.6% decrease in the estimated number of property crimes when compared to 2014.” As a result, police forces globally are under tremendous pressure to leverage technologies such as predictive analytics to draw insights from vast, complex data for fighting crime. This not only helps prevent robberies and burglaries but also aids in better utilization of limited police resources.

Fig. Predicting crime by applying analytics on data feeds from various sources

According to studies conducted by the University of California, crime in any area follows the same pattern as earthquake aftershocks. It is difficult to predict an earthquake, but once it happens, the aftershocks that follow are much easier to predict. The same applies to crimes in any geographical area. Combined analysis of past crime data and other influencing parameters helps predict the location, time, and category of crime.
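The aftershock analogy corresponds to a self-exciting (Hawkes-style) intensity model, where each past crime temporarily raises the expected rate of nearby future crimes. The sketch below is illustrative; the parameter values, grid-cell structure, and function names are hypothetical, and real predictive-policing systems fit these parameters to historical crime data.

```python
# Sketch of a self-exciting (Hawkes-style) crime intensity model
# (illustrative). Parameters are hypothetical, not fitted values.
import math

MU = 0.2      # background crime rate per day in a grid cell
ALPHA = 0.5   # boost each past event contributes
DECAY = 0.1   # how fast the boost fades (per day)

def crime_intensity(t: float, past_event_times: list) -> float:
    """Expected event rate at time t, given past events in the same cell."""
    boost = sum(
        ALPHA * math.exp(-DECAY * (t - s))
        for s in past_event_times if s < t
    )
    return MU + boost

def hotspot_cells(cells: dict, t: float, top_k: int = 3) -> list:
    """Rank grid cells by predicted intensity for patrol allocation."""
    ranked = sorted(cells, key=lambda c: crime_intensity(t, cells[c]), reverse=True)
    return ranked[:top_k]
```

A cell with recent incidents gets a temporarily elevated intensity, which is exactly how a burglary "aftershock" pattern translates into where to send the next patrol.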


With increasing crime rates, police forces globally are under tremendous pressure to leverage technologies such as predictive analytics to draw insights from vast, complex data for fighting crime.

Categories
Software Intensive Networks

Making smarter network investment decisions

Build an open-source network capacity planning framework to accelerate network decisions by 3X

The network complexity of service providers in the connectedness industry is ever increasing. The introduction of 5G on top of legacy 2G and 3G networks, coupled with rising customer expectations, imposes tremendous pressure on service providers. They often struggle with complex networks, dissimilar data, and inefficient visualization of inventories, which impacts network capacity planning. Some of the main challenges in conventional network capacity planning are:

  • Increasing inefficiencies in network planning due to rapid network expansions and complex networks
  • Difficulty in visualizing and monitoring the networks and their components due to the rapid expansion of networks
  • Difficulty in consolidating data, as the service provider’s network inventory data is scattered and retrieved from different types of vendor network equipment
  • Increased licensing, hardware, and customization costs for the service providers who use COTS products for network visualization

These challenges impact the service provider’s operations, leading to ineffective network capacity planning, delays in new network design and rollout, and inefficient network and resource utilization.

While many service providers depend on COTS products to address these challenges in unified visualization and capacity planning, it is recommended to consider an open-source approach that enables efficient and cost-effective capacity planning.
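At its core, a capacity-planning check over consolidated inventory reduces to comparing utilization against planning thresholds. The sketch below is illustrative; the thresholds, link-record fields, and action labels are hypothetical, and a real framework would pull these figures from the unified, multi-vendor inventory the text describes.

```python
# Sketch of a capacity-planning check over consolidated inventory
# (illustrative). Thresholds and record fields are hypothetical.
WARN_UTILIZATION = 0.70     # include in next planning cycle
EXPAND_UTILIZATION = 0.85   # order capacity now

def plan_action(link: dict) -> str:
    """Recommend an action for one network link based on utilization."""
    util = link["used_gbps"] / link["capacity_gbps"]
    if util >= EXPAND_UTILIZATION:
        return "expand"
    if util >= WARN_UTILIZATION:
        return "watch"
    return "ok"

def capacity_report(links: list) -> dict:
    """Summarize actions across the consolidated, multi-vendor inventory."""
    return {link["id"]: plan_action(link) for link in links}
```

With inventory data consolidated, a report like this turns scattered, vendor-specific utilization figures into a single prioritized list of investment decisions.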

Fig. Build an open-source framework to ease and accelerate network capacity planning decisions


The introduction of 5G on top of legacy 2G and 3G networks, coupled with increasing customer expectations, imposes tremendous pressure on service providers.

Categories
Digital Customer Experience

Making of an intelligent Virtual Agent to transform Customer Experience

Leverage a Machine Learning-based approach to optimize Virtual Agent training and improve its precision, recall, and accuracy

Virtual agents are an integral part of businesses with an online presence as they provide round-the-clock assistance to customers. They augment teams to enable a rich experience for both customers and live agents.

The success of any Virtual Agent (VA) depends on its Natural Language Understanding (NLU) training, which should not be a one-time activity before configuration but a continuous process. The challenge is to provide the right set of representative examples from historical data for training. Identifying a few hundred sample examples from millions of historical records is a herculean task. Moreover, this task is often done manually, making the selection of the most suitable examples both unreliable and extremely time-consuming.

Service providers should develop a machine learning (ML)-based tool to identify a small, highly representative data set of examples for training. The examples cover the maximum scope of each intent, making NLU training highly efficient with improved precision, recall, and accuracy. The ultimate benefits are improved customer experience, higher containment, and reduced abandonment. Since this is a tool-based approach, it also saves significant time compared to manually identifying training examples. Improved training efficiency on the first pass also saves time and effort during subsequent re-training.
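One simple way to pick a small yet representative set of training utterances is a greedy coverage heuristic, sketched below. This is an illustrative stand-in for the ML-based selection the text describes (which might instead cluster utterance embeddings and sample cluster centers); the tokenization and utterances are hypothetical.

```python
# Sketch of selecting representative NLU training examples (illustrative).
# A greedy vocabulary-coverage heuristic stands in for embedding-based
# clustering; real tools would work on utterance embeddings.
def tokens(utterance: str) -> set:
    """Very crude tokenizer: lowercase whitespace split."""
    return set(utterance.lower().split())

def select_representatives(utterances: list, k: int) -> list:
    """Greedily pick k utterances that together cover the most vocabulary."""
    covered: set = set()
    chosen = []
    pool = list(utterances)
    for _ in range(min(k, len(pool))):
        # Pick the utterance adding the most not-yet-covered tokens.
        best = max(pool, key=lambda u: len(tokens(u) - covered))
        chosen.append(best)
        covered |= tokens(best)
        pool.remove(best)
    return chosen
```

Because near-duplicate utterances add almost no new coverage, the heuristic naturally skips them, which is exactly the property a small NLU training set needs.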

Figure 1: ML-based intent analyzer tool


VAs often fail to satisfy customers due to an inability to identify the right intent. This is typically the result of wrong or inadequate training of the VA’s natural language understanding (NLU) engine.

Categories
Digital Customer Experience Insights

Intelligent automation – Combining RPA and AI to provide a delightful customer experience

Robotic process automation (RPA) in the connectedness industry has mostly been leveraged to automate back-end processes. But by integrating RPA with artificial intelligence (AI), service providers can expand its horizons to digital customer care initiatives. Implementing an AI-enabled RPA platform helps service providers deliver delightful customer experiences to millions of customers.

The following video insight shows how a service provider can integrate natural language processing, virtual assistants, and diagnostic tools with RPA solutions to provide digital omnichannel customer care. The benefits of intelligent automation include a 50-60% reduction in store visits, a 10-15% increase in NPS, and cost savings in the millions.


An AI-enabled RPA platform helps service providers deliver delightful customer experiences to millions of customers.

Original content- Video Insight: Providing delightful customer experience using AI-enabled Robotic Process Automation and digital care