Categories
Software Intensive Networks

Redefining Virtual Network Function (VNF) Testing

Creating an effective and portable VNF testing framework with end-to-end automation

Service providers in the connectedness industry are rapidly introducing new services online to enhance customer satisfaction. However, they face several testing challenges in terms of scope, complexity, and frequency during the delivery process. Almost 40% of a service provider's time and effort is consumed by testing activities, and Virtual Network Function (VNF) testing has become complex, costly, and time-consuming. To address the testing challenges faced by service providers in virtualized environments, existing testing methods and processes must be revised and shifted to a new testing model.

This Insight encompasses the key elements that can help service providers to create an effective and portable VNF testing framework with end-to-end automation. The key benefits include rapid test development and maintenance, faster product launch, improved product quality, and differentiated product delivery.


Accelerate service rollout time by 78% with a redesigned VNF testing framework.

    Authors:
  • Pratik Ravindra Tidke
  • Sumit Thakur
Categories
Cloud Insights

Prevent your data lake from turning into a data swamp

Build a light-weight efficient data lake on Cloud

The future of Service Providers will be driven by agile and data-driven decision-making. Service Providers in the connectedness industry generate data from various sources every day. Hence, integrating and storing their massive, heterogeneous, and siloed volumes of data in centralized storage is a key imperative.

Every service provider needs a high-quality data storage and analytics solution that offers more flexibility and agility than traditional systems. A serverless data lake is a popular way of storing and analyzing data in a single repository, featuring vast storage, autonomous maintenance, and architectural flexibility for diverse kinds of data.

Storing data of all types and varieties in central storage may be convenient but it can create additional issues. According to Gartner, “80% of data lakes do not include effective metadata management capabilities, which makes them inefficient.” The data lakes of the service providers are not living up to expectations due to reasons such as the data lake turning into a data swamp, lack of business impact, and complexities in data pipeline replication.

    Authors:
  • Manoj Kumar
  • Sriram V
  • Sathya Narayanan
Categories
Product Engineering

Move to technology-driven smart policing

Leverage predictive analytics to reduce crimes and burglaries by 30%

Today, crime rates in most parts of the world remain high despite necessary measures. FBI reports reveal a “3.9% increase in the estimated number of violent crimes and a 2.6% decrease in the estimated number of property crimes when compared to 2014.” As a result, police forces globally are under tremendous pressure to leverage technologies such as predictive analytics to draw insights from vast, complex data for fighting crime. This not only helps prevent robberies and burglaries but also aids in better utilization of limited police resources.

Fig. Predicting crime by applying analytics on data feeds from various sources

As per studies conducted by the University of California, crime in any area follows the same pattern as earthquake aftershocks. It is difficult to predict an earthquake, but once it happens, the aftershocks that follow are much easier to predict. The same applies to crime in any geographical area: a combined analysis of past crime data and other influencing parameters helps predict the location, time, and category of crime.
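The aftershock analogy can be sketched as a simple self-exciting risk score, where each past crime temporarily raises the predicted risk of its grid cell. This is an illustrative toy only; the function name, grid cells, and decay parameters are assumptions, not a description of any production model:

```python
import math
from collections import defaultdict

def risk_scores(events, now, baseline=0.1, boost=1.0, decay=0.5):
    """Score each grid cell as a baseline rate plus contributions from
    past crimes that decay exponentially with elapsed time, mimicking
    the aftershock (self-exciting) pattern described above."""
    scores = defaultdict(lambda: baseline)
    for cell, t in events:
        if t <= now:
            scores[cell] += boost * math.exp(-decay * (now - t))
    return dict(scores)

# Cells with recent crimes score higher, so patrols can be prioritized.
events = [("A1", 0), ("A1", 9), ("B2", 1)]
scores = risk_scores(events, now=10)
```

Here cell "A1", with a crime one time unit ago, scores far higher than "B2", whose only crime is nine units old and has mostly decayed.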


With the increasing crime rates, globally the police forces are under tremendous pressure to leverage technologies such as predictive analytics to draw insights from the vast complex data for fighting crimes.

    Authors:
  • Avaiarasi S, Director – Delivery (IoT)
  • Kamakya C, Project Manager – IoT
  • Sarvagya Nayak, Business Analyst – Insights
Categories
Software Intensive Networks

Making smarter network investment decisions

Build an open-source network capacity planning framework to accelerate network decisions by 3X

The network complexity of service providers in the connectedness industry is ever-increasing. The introduction of 5G on top of legacy 2G and 3G networks, coupled with rising customer expectations, imposes tremendous pressure on service providers. They often struggle with complex networks, dissimilar data, and inefficient visualization of inventories, which impacts network capacity planning. Some of the main challenges in conventional network capacity planning are:

  • Increasing inefficiencies in network planning due to rapid network expansions and complex networks
  • Difficulty in visualizing and monitoring the networks and their components due to the rapid expansion of networks
  • Difficulty in consolidating data, as the service provider’s network inventory data is scattered and retrieved from different types of vendor network equipment
  • Increased licensing, hardware, and customization costs for the service providers who use COTS products for network visualization

These challenges impact the service provider’s operations leading to ineffective network capacity planning, delays in new network design and rollout, and inefficient network and resource utilization.

While many service providers depend on COTS products to address these challenges in the unified visualization and capacity planning process, it is recommended to consider an open-source approach that enables efficient and cost-effective capacity planning.
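One building block of such an open-source framework is normalizing the scattered, per-vendor inventory exports into a common schema before any capacity calculation. A minimal sketch, where the vendor names, field names, and units are illustrative assumptions:

```python
# Hypothetical sketch: normalize per-vendor inventory exports into one
# common schema so capacity data can be consolidated and visualized.

def normalize(record, vendor):
    if vendor == "vendor_a":   # e.g. a flat CSV-style export
        return {"node": record["NodeName"],
                "capacity_gbps": float(record["Cap"]),
                "used_gbps": float(record["Used"])}
    if vendor == "vendor_b":   # e.g. a nested JSON API response in Mbps
        node = record["element"]
        return {"node": node["id"],
                "capacity_gbps": node["bw"] / 1000,       # Mbps -> Gbps
                "used_gbps": node["bw_used"] / 1000}
    raise ValueError(f"unknown vendor: {vendor}")

def utilization(records):
    """Per-node utilization from the consolidated records."""
    return {r["node"]: r["used_gbps"] / r["capacity_gbps"] for r in records}

records = [
    normalize({"NodeName": "N1", "Cap": "100", "Used": "40"}, "vendor_a"),
    normalize({"element": {"id": "N2", "bw": 10000, "bw_used": 2500}},
              "vendor_b"),
]
util = utilization(records)
```

Once every vendor feed maps into the same record shape, capacity dashboards and planning logic can be written once instead of per vendor.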

Fig. Build an open-source framework to ease and accelerate network capacity planning decisions


The introduction of 5G on top of legacy 2G and 3G networks, coupled with increasing customer expectations, imposes tremendous pressure on service providers.

    Authors:
  • Murugan Chidhambaram
  • Hari Ganesh
  • Priyanka Ravindran
  • Neha Sehgal
Categories
Digital Customer Experience

Making of an intelligent Virtual Agent to transform Customer Experience

Leverage a Machine Learning-based approach for optimizing Virtual Agent Training to improve its precision, recall and accuracy

Virtual agents are an integral part of businesses with an online presence as they provide round-the-clock assistance to customers. They augment teams to enable a rich experience for both customers and live agents.

The success of any Virtual Agent (VA) depends on its Natural Language Understanding (NLU) training, which should not be a one-time activity before configuration but a continuous process. The challenge is to provide the right set of representative examples from historical data for training. Identifying a few hundred representative examples from millions of historical records is a herculean task, and it is often done manually. As a result, the suitability of the chosen examples becomes questionable, and the process is extremely time-consuming.

Service Providers must develop a Machine Learning (ML)-based tool to identify a small, highly representative set of training examples. Such examples cover the maximum scope of each intent, making NLU training highly efficient with improved precision, recall, and accuracy. The ultimate benefits are improved customer experience, higher containment, and reduced abandonment. Since this is a tool-based approach, it also saves considerable time compared to manually identifying training examples, and improved training efficiency on the first pass saves time and effort during subsequent re-training.
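One way such a tool can pick a small set that covers the widest spread of phrasings is diversity sampling. The sketch below uses greedy farthest-point selection over bag-of-words cosine similarity; a real implementation would likely use trained sentence embeddings, and all names here are illustrative:

```python
import math
from collections import Counter

def similarity(a, b):
    """Cosine similarity over simple bag-of-words vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def pick_representatives(utterances, k):
    """Greedy farthest-point sampling: repeatedly pick the utterance
    least similar to those already chosen, so a small training set
    covers the widest variety of phrasings for an intent."""
    chosen = [utterances[0]]
    while len(chosen) < k:
        best = max((u for u in utterances if u not in chosen),
                   key=lambda u: -max(similarity(u, c) for c in chosen))
        chosen.append(best)
    return chosen

utterances = ["my bill is wrong", "billing error on my bill",
              "internet is down", "no internet connection"]
reps = pick_representatives(utterances, 2)
```

With k=2, the second pick is an internet-related utterance rather than the near-duplicate billing phrase, which is exactly the coverage property the training set needs.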

Figure 1: ML-based intent analyzer tool


VAs often fail to satisfy customers due to their inability to identify the right intent. This is the result of incorrect or inadequate training of the VA's natural language understanding (NLU) engine.

Categories
Digital Customer Experience Insights

Intelligent automation – Combining RPA and AI to provide a delightful customer experience

Robotic process automation (RPA) in the Connectedness industry has mostly been leveraged for the automation of backend processes. But by integrating RPA with artificial intelligence (AI), service providers can expand its horizons to digital customer care initiatives. The implementation of an AI-enabled RPA platform would help service providers to deliver delightful customer experiences to millions of customers.

The following video insight shows how any service provider can integrate natural language processing, virtual assistants, and diagnostic tools with RPA solutions to provide digital omnichannel customer care. The benefits of intelligent automation include a 50-60% reduction in store visits, a 10-15% increase in NPS and cost savings in millions.


An AI-enabled RPA platform will help service providers deliver delightful customer experiences to millions of customers.

Original content – Video Insight: Providing delightful customer experience using AI-enabled Robotic Process Automation and digital care

Categories
Operational Excellence

Go Beyond RPA to Speed Up Transaction Processing Time

Leverage effective continuous improvement techniques to achieve a high straight-through processing rate

Straight-through processing (STP) refers to the automated processing of transactions without manual intervention. Transaction processes are usually multi-staged, requiring multiple people across different departments and sometimes even involving paper checks. Companies often adopt RPA as a one-time solution to complete transactions and achieve a high STP rate. But is it really effective?

The estimated STP rate for any service provider in the connectedness industry is 75%-85%. However, the actual realization is only 30%-50%. One reason for this gap is that service providers implement RPA alone. Other widely used continuous improvement techniques, such as occasional and analytics-driven continuous improvement, have proven less effective at achieving the targeted STP rate. Service providers must adopt effective continuous improvement methods to get more value from their existing RPA implementation.

Adopt the Automation Optimizer Framework, an efficient continuous improvement strategy to improve your STP rate. The framework identifies automation inefficiencies, root causes, and solutions for the identified gaps, and continuously monitors STP rates, all in an automated manner. Its key components are:

  • Intelligent RCA (Root-cause analysis) Engine: Drills down to transaction-level information to automatically identify the root-cause for fallout
  • Integrated Solutionizer: Constantly analyzes the output from an Intelligent RCA Engine and triggers respective action based on the identified root cause
  • Continuous Monitoring Tool: Tracks the STP rate progress over time for the defined objectives, KPIs and milestones
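The three components above can be sketched as a single automated loop: classify each fallout to a root cause, map the cause to a remediation, and track the STP rate. The rule tables, cause labels, and actions below are illustrative assumptions, not a product specification:

```python
RULES = {  # Intelligent RCA Engine: map fallout symptoms to root causes
    "timeout": "slow_downstream_api",
    "field_missing": "bad_input_data",
    "ui_changed": "stale_bot_selector",
}
ACTIONS = {  # Integrated Solutionizer: root cause -> remediation
    "slow_downstream_api": "raise retry budget",
    "bad_input_data": "add input validation step",
    "stale_bot_selector": "re-record bot selectors",
}

def analyze(transactions):
    """Return the current STP rate (Continuous Monitoring Tool) plus a
    remediation plan keyed by each observed root cause."""
    fallouts = [t for t in transactions if t["status"] != "auto_completed"]
    stp_rate = 1 - len(fallouts) / len(transactions)
    plan = {RULES[t["symptom"]]: ACTIONS[RULES[t["symptom"]]]
            for t in fallouts}
    return stp_rate, plan

txns = [{"status": "auto_completed"}] * 3 + [
    {"status": "fallout", "symptom": "timeout"}]
rate, plan = analyze(txns)
```

Running this continuously over transaction logs turns fallout data into both a trend line (the STP rate) and a concrete backlog of fixes.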


The estimated STP rate for any service provider is 75%-85%; however, the actual realization is only 30%-50%. RPA implementation alone will not suffice if the STP rate is to be improved.

Categories
Operational Excellence

Giving wings to your standard RPA bots

Combine the power of RPA with NLP to improve the automation potential of service provisioning

Most service providers in the Connectedness industry have started leveraging Robotic Process Automation (RPA) to automate various processes, especially in service provisioning. However, the standard RPA bot alone cannot automate the end-to-end provisioning process, as it involves a lot of unstructured data that requires manual intervention for processing. According to Gartner, “Today, 80% of enterprise data is unstructured”. Processing such a huge amount of unstructured data and performing end-to-end automation with a standard RPA is a major challenge for service providers.

To overcome this challenge, service providers can combine the power of an RPA bot with a Natural Language Processing (NLP)-based engine capable of extracting and processing unstructured text. This further helps derive insights and provide the next best action, all in an automated way. Such end-to-end automation helps service providers reduce cycle time and provide efficient services to their customers.
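A minimal sketch of that hand-off: an extraction step turns a free-text service request into structured fields, which then drive the bot's next action. The regex patterns, field names, and order format are illustrative assumptions; a production NLP engine would use a trained model rather than patterns:

```python
import re

def extract_order(text):
    """Pull structured fields out of an unstructured service request."""
    fields = {}
    m = re.search(r"account\s*(?:no\.?|number)?\s*[:#]?\s*(\d+)", text, re.I)
    if m:
        fields["account"] = m.group(1)
    m = re.search(r"upgrade to\s+([\w\s]+?)(?:\.|,|$)", text, re.I)
    if m:
        fields["target_plan"] = m.group(1).strip()
    return fields

def next_action(fields):
    """Decide the structured step the RPA bot should perform next."""
    if "account" in fields and "target_plan" in fields:
        return ("provision_upgrade", fields["account"], fields["target_plan"])
    return ("route_to_agent", fields)  # incomplete data: human fallback

fields = extract_order("Please upgrade to Fiber 500, account #12345.")
action = next_action(fields)
```

The key design point is the fallback: whenever extraction is incomplete, the request routes to a human instead of the bot guessing.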


According to Gartner, “Today, 80% of enterprise data is unstructured”. Standard RPA alone cannot process such a huge amount of unstructured data and perform end-to-end automation.

    Authors:
  • Madhusudhanan S
  • Velmurugan M
  • Gurunath L V
  • Mogan A.B.

Categories
IT Agility

Fix the broken dispatch process to improve field service

Spare location intelligence can enable efficient dispatch operations and reduce the issue resolution time by 45%

Most service providers are struggling with the rising cost of field service due to an increase in repeat dispatches and higher issue resolution times. It takes longer to repair faulty hardware, impacting the customer experience and leading to a higher churn probability. According to Forrester, 73% of customers consider time the most critical element of customer service.

Spare parts information is critical to scheduling an efficient dispatch for repair activities. It is not only delivering the right spare parts that matters; they must also reach the right place at the right time. Getting this right the first time is difficult, as the current manual approach is error-prone and inefficient. With a wide gap between spare-part availability and onsite requirements, there is a high degree of unpredictability.

Fig. Steps followed in field service operation, showcasing the importance of spare information

To provide an efficient dispatch, it is crucial to ensure the right spare parts are available and dispatched from the nearest warehouse location through the most optimized route. The Spare-Location Intelligence framework can enable service providers to get real-time spare availability across warehouse locations and the most optimized route to access them.
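The core lookup in such a framework can be sketched as a shortest-path search from the fault site that stops at the first warehouse holding the needed part. The graph weights, warehouse names, and stock data below are illustrative assumptions:

```python
import heapq

def nearest_stocked_warehouse(graph, start, stock, part):
    """Dijkstra from the fault site: return the closest warehouse that
    has the needed part in stock, along with the travel cost."""
    dist, heap, seen = {start: 0}, [(0, start)], set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in seen:
            continue
        seen.add(node)
        if stock.get(node, {}).get(part, 0) > 0:
            return node, d          # first stocked node popped is nearest
        for nbr, w in graph.get(node, {}).items():
            if d + w < dist.get(nbr, float("inf")):
                dist[nbr] = d + w
                heapq.heappush(heap, (d + w, nbr))
    return None, float("inf")       # part unavailable anywhere

# Toy road network: travel costs between the site and two warehouses.
graph = {"site": {"W1": 5, "W2": 2}, "W2": {"W1": 1}}
stock = {"W1": {"router": 3}, "W2": {"router": 0}}
result = nearest_stocked_warehouse(graph, "site", stock, "router")
```

Note that W2 is closer but out of stock, so the search correctly routes via W2 to W1 at total cost 3 rather than the direct cost-5 road, combining availability and route optimization in one query.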


The Spare-Location Intelligence framework can enable service providers to get real-time spare availability across warehouse locations

    Authors:
  • Sumit Thakur
  • Murugan Chidhambaram
Categories
IT Agility

Eliminating avoidable truck rolls to save costs and improve customer satisfaction

Leverage statistical analysis and service truck roll optimizer for better field service efficiency

Truck rolls are an integral part of field service operations; a truck roll refers to any situation in which a technician is dispatched to resolve an issue. But often, a field technician is dispatched to a job that is temporary in nature or can be resolved in under five minutes with a quick fix. Situations like these have become a huge pain point for service organizations.

A peculiar challenge faced by most Service Providers is that 25% of the truck rolls in their fleet are deemed non-value-add (NVA) or avoidable, costing millions each year. The process of creating truck roll appointments and subsequent follow-up activities still involves a lot of manual work. The customer service representatives (CSRs) create work orders manually without thorough assessment, resulting in avoidable or non-value-added truck rolls.

Service Providers must adopt a proven approach to address truck roll inefficiency issues. Transform field service management processes using techniques like:

  • Service truck roll optimization: Filters the NVA truck rolls and updates the work logs in the CRM system. Only the work orders that do not meet the business filter criteria will result in truck rolls
  • Pareto principle and correlation analysis: Uses analytics to identify the main causes of NVA truck rolls
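The two techniques above can be sketched as a triage step before work orders reach the dispatch queue: filter out orders matching NVA criteria, then rank the causes behind them by frequency. The filter rules and cause labels are assumptions for the example:

```python
from collections import Counter

def is_nva(order):
    """Business-filter criteria for a non-value-add truck roll."""
    return bool(order.get("remote_fixable")     # quick remote fix exists
                or order.get("outage_in_area")  # issue is temporary
                or order.get("duplicate_of"))   # visit already scheduled

def pareto_causes(nva_orders):
    """Rank NVA causes by frequency to target the vital few first."""
    return Counter(o["cause"] for o in nva_orders).most_common()

def triage(orders):
    """Split work orders into dispatchable ones and filtered NVA ones."""
    nva = [o for o in orders if is_nva(o)]
    dispatch = [o for o in orders if not is_nva(o)]
    return dispatch, pareto_causes(nva)

orders = [
    {"id": 1, "remote_fixable": True, "cause": "signal reset"},
    {"id": 2, "cause": "hardware fault"},
    {"id": 3, "outage_in_area": True, "cause": "area outage"},
]
dispatch, causes = triage(orders)
```

Only the hardware fault results in a truck roll; the other two are logged with their causes, giving the Pareto analysis the data it needs to attack the biggest sources of avoidable dispatches.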


25% of the truck rolls in a service provider’s fleet are deemed non-value-add (NVA) or avoidable, thus costing millions each year.