Enabling Scalable and Rapid Data Product Delivery for Global Financial Services

The Problem

A large financial services organization, serving a global network of high-net-worth clients, faced challenges in efficiently delivering data products across the enterprise. The existing setup hindered rapid data product delivery and limited the ability to effectively scale data initiatives across the business. The organization needed a scalable solution that would:

  1. Centralize data from various sources while keeping data engineering effort low
  2. Standardize data in a unified Data Lake layer while facilitating exploratory access
  3. Enable multiple teams to prepare and model data for their own needs without impacting each other's development lifecycles

How Kenway Helped

Kenway implemented a Delta Lakehouse data platform on Azure Synapse, utilizing a medallion architecture (Bronze, Silver, and Gold layers). This platform enabled the organization to centralize, standardize, and govern data, while keeping data engineering complexity low and facilitating data access for multiple teams.​

  1. Metadata-Driven Ingestion - We developed a metadata-driven ingestion framework that uses data contracts for different source types (flat files, databases, and application events). This minimized the complexity of onboarding new data sources, which now require only configuration updates rather than code changes to integrate additional data (a minimal configuration sketch follows this list).
  2. Data Exploration Layer - A Silver Data Lake layer, accessible via serverless cloud technologies, enabled easy data exploration through familiar tools like SQL Server Management Studio and Power BI. This ensured teams could quickly analyze data without needing highly specialized skills.
  3. Project-Centric Databases - Serverless databases were provisioned for specific projects on top of the Silver Data Lake layer. Teams could tailor and use data as needed while ensuring governance through source-control-backed data contracts. These data contracts enabled change control with familiar practices like branching, versioning, and code reviews.
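
To illustrate the configuration-over-code idea, the sketch below shows what a metadata-driven dispatcher can look like when each source is described by a contract entry. The source names, storage paths, and handler functions are illustrative assumptions, not the client's actual implementation.

```python
# Minimal sketch of metadata-driven ingestion (illustrative only).
# Each data contract entry describes a source; onboarding a new source
# means adding a contract entry, not writing new pipeline code.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class DataContract:
    source_name: str          # hypothetical source identifier
    source_type: str          # "flat_file", "database", or "app_event"
    location: str             # path, connection string, or topic name
    schema: Dict[str, str]    # column name -> expected type

def ingest_flat_file(contract: DataContract) -> None:
    print(f"Loading flat file {contract.location} into Bronze for {contract.source_name}")

def ingest_database(contract: DataContract) -> None:
    print(f"Extracting tables from {contract.location} into Bronze for {contract.source_name}")

def ingest_app_events(contract: DataContract) -> None:
    print(f"Subscribing to events at {contract.location} for {contract.source_name}")

# One reusable pattern per source type; the contract decides which one runs.
HANDLERS: Dict[str, Callable[[DataContract], None]] = {
    "flat_file": ingest_flat_file,
    "database": ingest_database,
    "app_event": ingest_app_events,
}

def run_ingestion(contracts: list[DataContract]) -> None:
    for contract in contracts:
        HANDLERS[contract.source_type](contract)

if __name__ == "__main__":
    run_ingestion([
        DataContract("customer_master", "flat_file",
                     "abfss://bronze@lake.dfs.core.windows.net/customer_master/",  # hypothetical path
                     {"customer_id": "string", "segment": "string"}),
    ])
```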

Results

This solution significantly reduced the effort needed to onboard new data sources and simplified data engineering. Teams across the organization could independently access and model data, speeding up data product delivery without disrupting each other's workflows. Governance and control over shared data assets were enhanced through established version control processes, ensuring consistency and compliance across the data lifecycle.​

Case Study – Revolutionizing Customer Experience with a Voice Virtual Assistant and Google CCAI

Client Profile

Industry: Telecommunications

Client: Fortune 50 telecom

Solution: Voice Virtual Assistant (VA) Implementation

Partner Role In Project  

When a global telecommunications company wanted to build a new Voice Virtual Assistant leveraging Google's Contact Center AI (CCAI) platform, Kenway partnered with the client to harmonize the innovative capabilities of CCAI with the people, processes, and tools already in place. With Kenway's help, the client established a platform for cross-functional learning: Google gained insights into the client's methodologies, while the client received an in-depth understanding of CCAI strategies.

The Challenge

The client sought to improve the customer experience by implementing Contact Center as a Service (CCAI Platform) while upholding their high standards of operational efficiency and service. Their objective was to enable seamless interactions that delivered fast, accurate responses, alleviated the workload on their experts, and reduced operational costs.

While Google CCAI is an excellent out-of-the-box product, additional personalization is necessary to align with internal policies, practices, and service offerings. Balancing consistency in Google's Intelligent Virtual Agent (IVA) with the unique internal requirements is a complex task. This task requires a deep understanding of branding, including language and messaging. It also calls for close collaboration between Google and the client’s teams to integrate CCAI’s features while preserving the uniqueness of the client’s products and services.

The Kenway team worked closely with the client to identify the following goals:

The Solution

Data-driven product design decisions

Data plays a pivotal role in enhancing the overall user experience. By leveraging feedback-loop analysis and existing reporting tools like Power BI and Kibana, Kenway identified opportunities and pain points across all self-service customer experiences and brought them to Google with the goal of improving the customer experience and driving more calls into self-service.

    Customer welcome and intent identification

      Before a caller can access the variety of self-service functions, they first encounter a series of questions designed to capture their intent and confirm their account. Kenway recognized how essential it is to accurately identify the purpose of the call from the outset to deliver a tailored experience. By leveraging insights from Google’s data, Kenway optimized these initial prompts to feel natural and intuitive, creating an efficient, almost human-like dialogue that keeps callers engaged longer and encourages them to use self-service options.

      NLU tuning

        By quantifying the most common customer utterances, Kenway helped customize the Natural Language Understanding (NLU) model. These recommendations included giving customers more time to respond, slowing down prompts, and understanding specific terms, phrases, or accents unique to the client. As a result, the NLU model has significantly improved containment rates and enhanced accuracy in interpreting customer inputs, enriching the overall user experience. This analysis is conducted program-wide as customer behaviors and preferences evolve, utilizing a robust, customizable NLU model that adapts to fit specific needs related to products, services, and language.
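
As a simple, hedged illustration of utterance quantification, the snippet below ranks the most frequent caller phrasings so they can be prioritized as NLU training phrases; the transcript data and normalization rules are hypothetical.

```python
# Illustrative sketch: rank the most common caller utterances so the most
# frequent phrasings can be prioritized when tuning the NLU model.
from collections import Counter
import re

transcribed_utterances = [          # hypothetical transcript snippets
    "pay my bill", "I want to pay my bill", "talk to an agent",
    "pay my bill", "internet is down", "my internet is down",
]

def normalize(utterance: str) -> str:
    # Lowercase and strip punctuation so near-identical phrasings group together.
    return re.sub(r"[^a-z\s]", "", utterance.lower()).strip()

counts = Counter(normalize(u) for u in transcribed_utterances)

for phrase, freq in counts.most_common(5):
    print(f"{freq:>3}  {phrase}")
```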

        Customer experience and journey mapping 

Tying the customer experience to an overall customer journey strategy is essential to enhanced Customer Experience (CX) decision-making. A crucial part of these journey enhancements is minimizing the excessive questioning that can frustrate callers. By ensuring that questions are relevant and leveraging known customer information from past interactions, the IVA can more effectively guide callers through the system without overwhelming them. This contextual awareness allows the system to make informed decisions about a caller's intent, reducing the need for clarification and streamlining the overall experience.

          The Results

          Significant Progress

          With these enhancements, the client’s Google CCAI platform made significant strides in success metrics and self-service containment rates, currently handling over 25 functions, such as payments, troubleshooting, and billing, while servicing more than 1 million customers daily.

          Data-Driven Refinements 

The rapid development of call functions is based on continuous refinement cycles using data from real callers. The team designed efficient call flows that adapt to evolving consumer needs by carefully analyzing NLU performance and customer inputs. This includes reducing unnecessary prompts, leading to increased caller engagement and a seamless experience.

          Trust and Cost Efficiency

          While serving a larger number of callers can generate direct cost savings, the greater benefit lies in the enhanced trust between our client and its customers. By implementing an efficient system that effectively serves customers, both parties are building a reliable foundation for future interactions.

          Conclusion

          The client’s successful implementation of Google Contact Center AI, guided by Kenway Consulting, highlights the critical role of strategic collaboration, specialized technical knowledge, and data-driven insights combined with deep business knowledge. This partnership has led to notable improvements in operational efficiency, enhanced customer experiences, and innovative service delivery.

          Are you ready to elevate your customer interactions with an effective Voice Virtual Assistant strategy? Contact us today to discover how CCAI can transform your service capabilities!

          Case Study – Streamlining Success: Transforming Transfer Agency Processes 

          Client Profile 

          Background 

          As a leading player in the financial services industry, the client has a reputation for delivering exceptional custodial services. Within their portfolio lies their critical role as a transfer agent, supported by specialized teams including: 

          Each time the client acquires new business, their transfer agency team undertakes the intricate process of onboarding and converting services from the previous custodian. This transition demands immense coordination and precision, often straining resources and impacting downstream processes. 

          When the client secured a significant new business, they quickly faced unprecedented challenges, including higher-than-expected volumes and unique client demands. 

          Challenges 

          The onboarding process for the new business revealed several operational hurdles: 

          The situation threatened to impact client commitments and relationships, requiring immediate intervention. 

          Approach and Solution 

          With Kenway’s help, the client reimagined their transfer agency processes. Together, we developed a scalable operating model that empowered the client to drive successful change in their organization. 

We began with a comprehensive, timely review of the client's current state, which drove the definition of a roadmap for future improvements:

          1. Current and Future State Analysis: 
            • Conducted SME interviews across all teams to map existing workflows. 
            • Standardized over 300 procedure documents to streamline operations. 
            • Facilitated tabletop discussions to uncover inefficiencies and align team processes. 
            • Recommended prioritized improvements with measurable ROI. 
2. Future State Documentation:
            • Created end-to-end process flows for transfer agency subgroups, enhancing transparency and accountability. 
            • Developed a conversion runbook, ensuring consistency and quality in future client transitions. 
            • Added new procedure documents, user guides, and client reporting templates to close documentation gaps. 
3. Change Management Execution:
            • Established a Resolutions Team and Quality Assurance Team to address critical gaps, supported by detailed onboarding materials. 
            • Built the Transfer Agency Knowledge Center on SharePoint, transforming a difficult to maintain shared drive into a centralized repository for streamlined knowledge management and easy self-service. 

          Results and Impact 

          The partnership between the client and Kenway delivered tangible, lasting results: 

          A Partnership for Lasting Success 

          Through close collaboration, the client demonstrated their ability to adapt and innovate in the face of challenges. By leveraging Kenway’s Product and Program Management expertise, operational inefficiencies were turned into opportunities for growth. 

The client's feedback speaks to the power of this partnership: "Amazing, a model for what a lot of teams should have."

          The redesigned processes and tools allowed the client to ensure delivery commitments were met, creating tangible results for the servicing of their client and the management of business operations.  

          Ready to Transform Your Business? 

          Partner with us to drive innovation, streamline processes, and achieve results tailored to your unique needs. Contact us today to see how we can help you succeed. 

Case Study – Customer Proprietary Network Information Data Visibility and Business Intelligence

          How Kenway Consulting Helped a Fortune 50 Telecom Provider Enable Data Visibility into Their Customer Election History and Develop Interactive Reports for Internal Investigations and Reconciliation

          Client Profile:

          Background:

In today's data-driven landscape, data visibility and Business Intelligence (BI) reporting are crucial for complying with increasingly stringent data privacy regulations. Security issues are attracting growing public attention, and enterprises must prioritize compliance with these mandates to avoid significant financial and reputational repercussions. Penalties for non-compliance can reach millions of dollars, with additional daily fines for ongoing violations. For corporations managing vast amounts of customer and/or employee data, achieving and maintaining compliance can be a complex undertaking.

          The Problem

A client's lack of data visibility into their extensive customer base exposed them to potential fines from the FCC for non-compliance with customer Personally Identifiable Information (PII) and Customer Proprietary Network Information (CPNI) regulations. Non-compliance can result in hefty fines reaching upwards of $80 million, jeopardizing the bottom line. Limited visibility into data storage systems and the broader IT landscape hampered the client's ability to ensure compliance and address system issues promptly. Without this enhanced observability, assessing and ensuring compliance was a guessing game or sheer luck.

          The Challenge:

          To stay compliant with data privacy regulations and prevent security risks, organizations must respond swiftly and accurately to data privacy requests. Achieving this requires a deep understanding of their data flows and the root causes of any internal failures—whether in systems, technology, or processes. However, gaining this level of insight is a major challenge.

Maintaining an enterprise-level reporting system that ensures data accuracy, tracks breaches, monitors system performance, and resolves issues within mandated timeframes involves the coordination of multiple teams and departments. Database owners were so focused on addressing production problems that they didn't have the capacity to build out reporting dashboards for the business. As a result, the business lacked the data visibility it needed into this area of focus.

          The Solution

          With pre-existing knowledge of CPNI industry-specific regulations and FCC guidelines, Kenway was well positioned to ensure customer opt-in/opt-out preferences were accurately captured in internal systems, especially in complex data environments where customer preferences across multiple lines of business were being maintained in numerous source systems.

Kenway identified the appropriate backend source systems and engaged the database administrators to accurately ingest data into Power BI. Additionally, Kenway wrote custom scripts to parse a system fallout mailbox and load that data into the business intelligence tool to reconcile against the backend data sources. The team delivered reliable, real-time reporting to the client team with a dashboard that addressed their gaps, enabling leadership to promptly identify finable offenses and correct them within the mandated timeframes. The team also built current and future state system mappings, which supported the client's approach to proactively reducing defects before they occurred.
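
The internals of those scripts are not described in the source, so the following is only a hedged sketch of the reconciliation pattern: parse exported fallout messages into records and compare them against a backend extract before loading the results into the BI tool. The file formats, field names, and matching rule are assumptions.

```python
# Illustrative sketch: parse system-fallout messages exported from a mailbox
# and reconcile them against records pulled from a backend source system.
import csv
import re
from pathlib import Path

FALLOUT_PATTERN = re.compile(r"account=(?P<account>\d+)\s+election=(?P<election>OPT_IN|OPT_OUT)")

def parse_fallout_messages(export_dir: Path) -> dict[str, str]:
    """Extract account -> election pairs from exported message bodies (hypothetical format)."""
    fallouts: dict[str, str] = {}
    for msg_file in export_dir.glob("*.txt"):
        match = FALLOUT_PATTERN.search(msg_file.read_text())
        if match:
            fallouts[match["account"]] = match["election"]
    return fallouts

def load_backend_elections(extract_csv: Path) -> dict[str, str]:
    """Load account -> election pairs from a backend system extract (hypothetical columns)."""
    with extract_csv.open(newline="") as f:
        return {row["account_id"]: row["election"] for row in csv.DictReader(f)}

def reconcile(fallouts: dict[str, str], backend: dict[str, str]) -> list[dict[str, str]]:
    """Flag records where the fallout mailbox and backend disagree, for the BI dashboard."""
    return [
        {"account_id": acct, "mailbox": election, "backend": backend.get(acct, "MISSING")}
        for acct, election in fallouts.items()
        if backend.get(acct) != election
    ]

if __name__ == "__main__":
    mismatches = reconcile(parse_fallout_messages(Path("fallout_export")),
                           load_backend_elections(Path("backend_extract.csv")))
    print(f"{len(mismatches)} records need Compliance Team attention")
```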

          The Approach:

          1. Define and confirm existing challenges and gaps.
          2. Partner with business stakeholders to document requirements for the desired target state.
          3. Groom requirements and prioritize a backlog of user stories.
          4. Align delivery expectations with stakeholders.
          5. Conduct discovery on the existing technical ecosystem and data sources.
          6. Identify data and system-related defects and coordinate with appropriate teams to address.
          7. Collaborate with database owners to understand key data sources and tables for each reporting use case. Understand and document the underlying logic behind each important field.
8. Ingest all relevant data sources into Power BI and begin modeling.
          9. Iteratively build reports, demo, collect feedback, enhance, test, and deploy.
          10. Educate business stakeholders on report usage and walk through specific use cases end to end. Provide supporting technical artifacts.
          11. Deploy MVP reports to production.

          The deliverables provided were:

          1. Reliable, Real-time, Interactive Reporting Package
            • Embedded logic to identify which records required immediate attention from the Compliance Team
• Deployed a product-centric approach to expand scope and scale with ease
          2. Comprehensive Data Flow Diagrams
            • Provided root cause visibility by creating detailed current and desired state system and data flow maps which facilitated efficient investigations and prevented potential fallouts before they occurred
          3. Secure Environments
• Leveraged Cloud Gateways and ODBC drivers for secure connectivity
            • Secure daily data refreshes to reflect records that were rectified the day prior

          Data flow diagram showcasing the consolidation of legacy sources to provide enhanced data visibility for compliance.

          The Result

          The Kenway Team delivered a comprehensive suite of over 10 cutting-edge, user-friendly reports with export capabilities, empowering users with actionable insights. Additionally, the Compliance Team was equipped with targeted training and robust technical documentation, enabling them to conduct thorough offline investigations for emerging use cases. This dual approach of powerful reporting tools and data visibility improvements significantly enhanced the client team’s investigative capabilities and operational efficiency. 

          The reporting suite significantly reduced the manual effort required for investigations and provided more relevant insights allowing users to spend more time rectifying the system fallouts, instead of identifying them in the first place. The Compliance Team began to streamline troubleshooting by tracing system fallouts to their root cause for swift investigations and rectifications within the time span defined by the FCC. Users are equipped with an understanding of upstream systems allowing them to streamline their work by proactively addressing potential recurring fallouts and work to limit their occurrence. 

The FCC enforces substantial fines for failure to comply with CPNI rules, including the annual certification requirement. Violations may subject a carrier to enforcement action, including monetary forfeitures of up to $220,213 for each violation or each day of a continuing violation, up to a maximum of $2,202,123 per incident. These limits apply to failures to meet the FCC's notification requirements, including any breaches related to Initial Rights Notification (IRN) or similar obligations under CPNI mandates. The proactive approach to compliance has led to the identification and correction of 2,203 unique incidents, conservatively preventing an estimated $485 million in potential future fines.
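
One conservative way to arrive at a figure of roughly $485 million is to treat each of the 2,203 corrected incidents as a single violation at the maximum per-violation forfeiture:

```python
# Conservative estimate: each corrected incident counted once at the maximum
# per-violation forfeiture (ignoring per-day accrual and the higher
# per-incident cap, which would only increase the exposure).
max_per_violation = 220_213
incidents_corrected = 2_203

estimated_exposure = max_per_violation * incidents_corrected
print(f"${estimated_exposure:,}")   # $485,129,239 ~= $485 million
```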

The delivered solution is inherently scalable across multiple regions, business units, or customer bases as the business looks to expand its scope. The use of daily data refreshes, cloud gateways, and secure ODBC connections ensures ongoing compliance needs are met and mitigates the risk of future violations as regulations or internal processes change. Ultimately, the solution is well suited to be expanded and adapted to future regulatory changes.

          Conclusion:

          Throughout this solution walkthrough, we have explored how to navigate the complexities of data privacy regulations and explained how the lack of data visibility into customer data can pose a significant challenge for compliance. We have delved into the power of Business Intelligence and what it can do for any organization, particularly in terms of enabling internal Compliance Teams with the ability to identify actionable insights. 

          After partnering with Kenway, the organization had reliable data visibility for the first time to verify that they were compliant with FCC regulations. They received comprehensive data lineage documentation to streamline troubleshooting, allowing them to trace fallouts presented in the dashboard to their root cause and conduct swift investigations and rectifications. Moreover, the secure environment and daily data refreshes ensured they were always working with the most up-to-date information.

          Ready to enhance your data visibility and streamline compliance? Contact us for tailored business intelligence solutions.

          Case Study - Scalable Data Product Delivery

          Client Profile:

A leading global financial services firm, founded over 100 years ago, that provides wealth management, asset servicing, asset management, and banking solutions to corporations, institutions, and affluent individuals. This stalwart institution, bloated with legacy processes and technologies, focuses strongly on a highly tailored, client-centric approach. This means that enabling scalable data-driven solutions is of the utmost importance as they gear up for the future.

          Problem Overview

A deep understanding of the customer is a powerful tool for cross-selling and extracting additional value from an established relationship. Hence, one example of a data product our client was looking to deliver on this new data platform was a solution that modeled characteristics of existing customers, using a wide cross-section of existing data, to identify a population of 'similar' customers. Starting with an 'ideal' customer (ideal because they generate a significant revenue stream), the model would identify a population of 'similar' customers to whom the client planned to cross-sell additional products and services. This particular product was an experimental ML model that they hoped would yield a return.
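
The source does not detail the model itself; as a hedged illustration, a lookalike approach of this kind can be as simple as ranking customers by the similarity of their feature vectors to a seed group of 'ideal' customers. The features, library choice, and similarity measure below are assumptions.

```python
# Illustrative lookalike sketch: rank customers by cosine similarity of their
# feature vectors to the centroid of a seed group of "ideal" customers.
import numpy as np

def find_similar_customers(features: np.ndarray, ideal_idx: list[int], top_n: int = 5) -> np.ndarray:
    """features: rows are customers, columns are standardized attributes (e.g. balances, product counts)."""
    centroid = features[ideal_idx].mean(axis=0)
    # Cosine similarity between each customer and the ideal-customer centroid.
    norms = np.linalg.norm(features, axis=1) * np.linalg.norm(centroid)
    similarity = features @ centroid / np.where(norms == 0, 1, norms)
    similarity[ideal_idx] = -np.inf              # exclude the seed customers themselves
    return np.argsort(similarity)[::-1][:top_n]  # indices of the most similar customers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo_features = rng.normal(size=(1000, 8))   # hypothetical standardized customer attributes
    print(find_similar_customers(demo_features, ideal_idx=[0, 1, 2]))
```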

Thinking more broadly about the scalable delivery of this type of data product, and of future, yet-undetermined ideas, our client had several specific functional requirements that a data platform would need to meet in order to enable this type of experimentation and potential value delivery across the organization.

          1. Centralize data from various sources, while keeping the data engineering effort as low as possible
          2. Standardize data in a unified Data Lake layer, while facilitating exploratory/sandbox access
3. Enable various teams, including data scientists and BI dashboard authors, to access, prepare, and model data to fit their specific needs for data product delivery, without impacting each other's development lifecycles

          Background:

High-quality data is the first step in enabling data product delivery. Whether it is a customer segmentation analysis performed by the marketing department to drive a more targeted campaign for a new product launch, or a cash flow modeling exercise performed by the FP&A group to manage liquidity while optimizing business performance, every data product depends on high-quality, accessible, and well-understood data.

          The typical steps taken to source organizational data for development of any data product or analysis are as follows:

          1. Identifying sources & owners
          2. Building integrations
          3. Operationalizing the integrations
          4. Analyzing and preparing the data for the intended use
            1. By this point it often becomes necessary to refine existing integrations or even source additional data
            2. This all leads to added rework and time spent waiting to actually derive real value from the analysis
          5. Finally, the company is ready to deploy the analysis or insight – in our example the customer segmentation analysis, or cash flow model.

Only once the data product is deployed can it produce organizational value (in the form of increased revenue or reduced costs) and hence a positive return on investment that overcomes the initial investment required to bring the product to life. Activities up to this point will not yield significant value by themselves: new integrations or data engineering work won't produce value unless used appropriately, and centralized data is of little value unless someone does something useful with it.

          The data lifecycle is rarely linear; integration refinement and data sourcing efforts are ongoing and iterative, hence a solution or approach with the characteristics listed below is needed:

          Approach & Technologies Leveraged:

          Our approach to data management and product delivery centers on several guiding principles as follows:

          Metadata Driven Ingestion

          Reusable Data Engineering Patterns

          Data Product Alignment

          Leverage Best-In-Class Cloud Tooling & Practices

In our client's use case, the solution utilized a Delta Lakehouse platform built on Azure Synapse, leveraging a medallion Data Lake architecture – Bronze, Silver, and Gold layers – along with metadata-driven ingestion and standardization processes driven by repository-housed data contracts. Specific technologies and approaches are listed below.

          Data Contract Driven Development

A data contract is an agreement between data producer(s) and data consumer(s). The primary purpose of a data contract is to ensure dependable, high-quality data that garners trust from all stakeholders. Data contracts are a technology-agnostic solution that:

          Azure Synapse Modern Data Platform

          Kenway Consulting recommended the use of Azure Synapse Analytics as the central data processing platform. This choice was driven by the platform's integrated capabilities, which combine big data and data warehousing, allowing for seamless ingestion, preparation, and serving of data. The platform's architecture supports both on-demand and provisioned resources, offering flexibility in efficiently managing diverse workloads.

          Serverless SQL for Data Modeling

With the standardized (Silver) data residing in Azure Blob Storage, Kenway implemented serverless SQL models within Azure Synapse. This serverless approach allowed for on-the-fly querying and modeling without the need for dedicated compute infrastructure. The serverless SQL capabilities integrated seamlessly with the existing Azure Synapse environment, providing a cost-effective solution for modeling and preparing data for analysis.
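
As a hedged sketch of what querying the standardized layer can look like, the snippet below runs a serverless OPENROWSET query against Delta files in storage through pyodbc; the workspace endpoint, storage path, and column names are illustrative assumptions.

```python
# Illustrative sketch: query Silver-layer Delta files through Azure Synapse
# serverless SQL, with no dedicated compute provisioned.
import pyodbc

CONNECTION = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=myworkspace-ondemand.sql.azuresynapse.net;"   # hypothetical serverless endpoint
    "Database=master;Authentication=ActiveDirectoryInteractive;"
)

QUERY = """
SELECT TOP 10 customer_id, segment, total_assets
FROM OPENROWSET(
    BULK 'https://mylake.dfs.core.windows.net/silver/customers/',  -- hypothetical Delta folder
    FORMAT = 'DELTA'
) AS customers;
"""

with pyodbc.connect(CONNECTION) as conn:
    for row in conn.execute(QUERY):
        print(row.customer_id, row.segment, row.total_assets)
```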

          The Solution:

Keeping in mind the above goal of streamlining low-value activities to allow more time for developing ROI-generating products, our solution incorporated the following feature sets:

1. Metadata-driven ingestion via data contracts was developed per source archetype – flat file sources, database sources, and application events. This minimized the effort needed to onboard new data sources and reduced data engineering complexity, as additional source objects could be added via configuration. A Silver Delta file layer, exposed via external tables in a serverless SQL workbench, allowed easy exploration of the data via familiar tools such as SQL Server Management Studio and Power BI.
2. Gold, product- and use case-specific serverless databases, driven off the same underlying Silver Delta files, allowed different teams to model and conform data as required for their use cases, while governance was established over the underlying shared Silver assets using Git-managed data contracts. This exposed familiar change control patterns, as would be employed in any SDLC, such as branching, version history, code review, and merge approval processes (a minimal sketch of this contract-driven governance follows this list).
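
To make the contract-driven governance concrete, here is a hedged sketch of a repository-housed data contract and a validation step that could run as part of a pull-request check before a change to a shared Silver asset is merged. The contract fields and the breaking-change rules are illustrative assumptions, not the client's actual artifacts.

```python
# Illustrative sketch: a repository-housed data contract for a shared Silver
# asset, plus a breaking-change check suitable for a pull-request/CI step.
# In practice the contract would live in the Git repository
# (e.g. contracts/silver_customers.json) alongside the pipeline code.

EXAMPLE_CONTRACT = {                      # hypothetical contract content
    "dataset": "silver.customers",
    "owner": "data-platform-team",
    "columns": {"customer_id": "string", "segment": "string", "total_assets": "decimal"},
}

def breaking_changes(old: dict, new: dict) -> list[str]:
    """Flag dropped or retyped columns so reviewers must explicitly approve them."""
    issues = []
    for column, col_type in old["columns"].items():
        if column not in new["columns"]:
            issues.append(f"column dropped: {column}")
        elif new["columns"][column] != col_type:
            issues.append(f"type changed: {column} {col_type} -> {new['columns'][column]}")
    return issues

if __name__ == "__main__":
    proposed = dict(EXAMPLE_CONTRACT, columns={"customer_id": "string", "segment": "int"})
    for issue in breaking_changes(EXAMPLE_CONTRACT, proposed):
        print("REVIEW REQUIRED:", issue)
```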

Key to this solution and approach was a data contract-driven development approach, which specifically enabled the following:

          Conclusion

          Kenway created a unified data platform that supports rapid and efficient data product delivery, aligning perfectly with the client's needs.

          How Kenway Consulting Can Help

          At Kenway Consulting, our Modern Data Enablement services are designed to help organizations capitalize on data as a strategic asset. We leverage cloud technology and a composable data ecosystem to optimize data utilization and analytics. Our approach focuses on integrating data and analytics into your business strategy, driving data quality, automating data consolidation, and delivering actionable insights to key stakeholders.

          Why Choose Kenway?

          In summary, by partnering with Kenway Consulting, our client benefited from a comprehensive and scalable data solution that streamlined their data processes, enhanced data quality, and enabled rapid deployment of data products. This case study underscores our commitment to delivering tailored, high-impact solutions that drive business value and support strategic objectives in the financial services industry. If you’re interested in discovering how Kenway Consulting can help your organization leverage scalable data solutions and enhance your financial services data management, please reach out to us.

          Transforming Transaction Throughput: FinTech App Development

          How Kenway Consulting’s innovative approach and deep technical expertise allowed a FinTech start-up to overcome significant performance barriers and scale their payment processing application by 450% in just 30 days.

          Client Profile:

          Background:

Kenway partnered with a start-up company specializing in FinTech application development that offers a suite of payment processing solutions that help people and organizations securely send money worldwide. Their payment products are used by organizations of all sizes, including Fortune 500 companies, and have enabled them to securely send billions of dollars each year. Kenway was brought on for its application development and Azure expertise to help improve the performance of one of the client's new payment processing applications.

          The Problem:

One of the client's products, an application to process and validate transactions, was in late-stage development and was functionally ready to be brought to customers. While the client did not yet know the volume customers would require, in terms of transactions per second, early load testing suggested that the original application could only support about 60 transactions per second. The initial architecture (see below) involved app services directly inserting records into a transactional database, leading to concerns around bottlenecking and potential loss of records during API outages or periods of high volume.

Original architecture diagram: app services inserting records directly into the transactional database.

During contract negotiations, the client felt that their current performance (60 transactions per second) might not be sufficient to satisfy the volume demanded by customers of their FinTech application. Further, because this requirement to improve transaction throughput came late in the development cycle, only a month remained to find a solution that would provide a significant improvement for the product. While the database itself was likely the restricting factor, the short timeline dictated the need for a cost-effective, innovative, efficient solution that would provide the needed throughput quickly.

          The Solution:

Because the client had already scaled the existing database as far as cost constraints allowed, Kenway embarked on envisioning a new process. By only subtly modifying the existing architecture and configurations, Kenway was able to provide a cost-effective and fast-to-implement solution that could support the required throughput.

          Kenway reviewed the existing code and executed baseline and load tests against the architecture to understand the current capacity and find potential inefficiencies causing latency or errors in transmission. After making minor improvements to remove inefficiencies causing latency, Kenway configured staging resources in between the original app services and the existing database.

By segmenting the existing code base and introducing a staging area that could effectively throttle the number of entries being sent to the existing database, Kenway was able to reuse the architecture and avoid modifying the database itself. After re-platforming the original app services to function with an HTTP trigger, Kenway introduced a storage queue to ensure that zero records would be lost during high-volume transmission. To complement this, the process was designed with a storage trigger using blob storage to hold the full transaction payloads before they were sent to the final database.

On the other side of the storage queue, Kenway implemented function apps to pull messages and payloads from storage and write entries to the database, a solution that is both elastic and able to be scaled horizontally. Because customer requirements dictated the need to scale based on the number of transactions, new app configurations allowed for event-driven scaling to both increase and decrease the capacity of the applications based on current transaction volume or even a potential API outage. Pairing these Azure functions allows the app services to continue working at their full potential while eliminating the bottlenecks and risks that stemmed from the direct connection between API and database.
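
The exact implementation is not shown in the source; below is a hedged sketch, using the Azure Functions Python programming model, of the queue-triggered pattern described above: a function picks up a queued transaction reference, reads the full payload from blob storage, and writes it to the database. The queue, container, table, and setting names are assumptions.

```python
# Illustrative sketch (Azure Functions Python v2 model): drain the staging
# queue and write transaction payloads from blob storage into the database,
# decoupling the HTTP-triggered app services from the database itself.
import json
import os

import azure.functions as func
import pyodbc
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp()

@app.queue_trigger(arg_name="msg", queue_name="transactions",       # hypothetical queue name
                   connection="AzureWebJobsStorage")
def process_transaction(msg: func.QueueMessage) -> None:
    # The queue message carries only a blob name; the full payload lives in blob storage.
    blob_name = msg.get_body().decode("utf-8")

    blobs = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
    payload = json.loads(
        blobs.get_blob_client(container="transaction-payloads", blob=blob_name)
             .download_blob().readall()
    )

    # Write the validated transaction to the existing transactional database.
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        conn.execute(
            "INSERT INTO dbo.Transactions (TransactionId, Amount, Payload) VALUES (?, ?, ?)",
            payload["id"], payload["amount"], json.dumps(payload),
        )
        conn.commit()
```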


          The Result:

          1. Cost Effective and Fast Delivery

Faced with an extremely short timeline of one month to increase transaction throughput as much as possible, Kenway stepped in with expertise in FinTech application development, applied existing knowledge and experienced resources, and provided an innovative, fast, and comparatively cheap solution. Kenway's deep experience with Azure, paired with a mindset of determination and creative thinking, allowed Kenway and the client to experiment confidently toward an unknown goal.

2. Volume

During the rapid redesign phase, along with continued contract negotiations, the client's customers' needs became defined: a peak requirement of 180 transactions per second.

By introducing this staging area, backed by the Azure cloud, the client was able to push their current architecture to comfortably process 250 transactions per second, all without major changes to the codebase or existing logic.

3. Security and Guaranteed Delivery

          Equally as important as volume, the introduction of a storage queue combined with function apps guarantees that all transaction records will be stored securely short-term before being successfully delivered to the master database, a necessary function for this custom application.

4. Scalability and Long-Term Solution

The inclusion of both a storage queue and function apps provided an elastic solution that can, and will, scale with the introduction of new databases as customer requirements continue to change and demand more volume. The new scalable architecture will support potentially very high throughput and successful delivery to the database with minimal changes in the future. While the goal was met, Kenway additionally made recommendations on long-term code improvements and other inefficiencies still present in the original app services and database.

          CONCLUSION

By leveraging Azure cloud solutions and designing a scalable, cost-effective architecture, Kenway not only met the client's immediate needs but also provided a sustainable framework for future growth. Kenway's ability to swiftly diagnose bottlenecks and implement cutting-edge solutions demonstrates our commitment to delivering high-impact results on aggressive timelines. If you are facing similar challenges or are interested in exploring how Kenway can help optimize your operations, please don't hesitate to contact us to start the conversation.

          Application Optimization: Maximize Your Enterprise Applications

          Maximize the value of your investment in enterprise applications by developing lightweight custom integrations that can be delivered in weeks.

          In today's dynamic business environment, enterprise applications are crucial for streamlining operations, enhancing productivity, and driving growth. However, the cost of licenses for enterprise applications often leads to either overspending on licenses that are not fully leveraged or underutilization due to limited access, resulting in inefficiencies and significant financial burdens. By implementing lightweight custom integrations, organizations can extend core functionalities to a wider user base without the burden of additional license fees, ensuring both cost-efficiency and broader access to critical tools and data. This approach not only bridges the gap between enterprise application capabilities and user needs but also maximizes the overall value of your investment.

          A Custom Solution for Cost-Efficiency and Value Maximization

          These custom applications provide a cost-effective solution for application optimization that enhances functionality and extends access without the need for additional expensive licenses.

          Here’s how this approach benefits organizations:

          1. Tailored Functionality for Specific Needs: By analyzing the specific requirements of different user groups within the organization, custom application optimizations can be designed to provide only the necessary functionalities. This targeted approach ensures that users have access to the tools they need to perform their tasks efficiently without the overhead of unused features. For example, a custom application might allow lower-permissioned users to log data, view reports, or manage simple workflows, aligning costs with actual usage.
2. Extended Access and Increased Data Insights: Providing access to custom applications for users without licenses allows them to both contribute and view valuable data. For instance, a field technician could use a lightweight custom app to log maintenance activities directly into the enterprise system and access relevant data, such as maintenance schedules and equipment history (a minimal sketch of such an app follows this list). This enriched data pool enhances operational insights, and the ability for users to view data supports better decision-making and a more comprehensive understanding of the organization's operations.
          3. Cost Savings and ROI: Custom applications significantly reduce the need for additional full licenses, leading to substantial cost savings. The ongoing maintenance and technology costs associated with these applications are typically much lower than the costs of additional licenses. This reduction in expenditure improves the return on investment (ROI) for the enterprise application, making it a more financially viable asset.
          4. Scalability and Flexibility: Custom applications can be scaled and modified as the organization’s needs evolve. This flexibility ensures that the solution remains relevant and effective over time, adapting to changes in business processes and user requirements. It also allows for the integration of new features or the expansion of existing functionalities without the need for major overhauls or additional licensing costs.
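
As a hedged illustration of the field-technician scenario above, a lightweight app can sit in front of the enterprise system's API under a single licensed service account, exposing only the narrow actions unlicensed users need. The framework choice, endpoints, and payload below are hypothetical.

```python
# Illustrative sketch: a minimal FastAPI app that lets unlicensed field users
# log maintenance activities through a single licensed service-account
# integration with the enterprise system's API (endpoints are hypothetical).
import os

import requests
from fastapi import FastAPI
from pydantic import BaseModel

ENTERPRISE_API = "https://enterprise.example.com/api/v1"      # hypothetical base URL
SERVICE_TOKEN = os.environ["ENTERPRISE_SERVICE_TOKEN"]        # one licensed service account

app = FastAPI(title="Maintenance Logger")

class MaintenanceLog(BaseModel):
    equipment_id: str
    technician: str
    notes: str

@app.post("/maintenance")
def log_maintenance(entry: MaintenanceLog) -> dict:
    """Forward a narrow, pre-approved action to the enterprise system."""
    response = requests.post(
        f"{ENTERPRISE_API}/work-orders",
        json=entry.model_dump(),
        headers={"Authorization": f"Bearer {SERVICE_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return {"status": "logged", "enterprise_id": response.json().get("id")}
```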

          Leveraging AI for Rapid MVP Development

In developing these lightweight applications through application optimization, one of the key advantages Kenway offers is the ability to deliver a Minimum Viable Product (MVP) very quickly using AI-enabled development. By doing so, we can rapidly generate the foundational code base, allowing us to deliver the core set of functionalities to users in a very short time frame.

          This AI-driven approach accelerates the development process significantly, enabling us to:

          1. Quickly Validate Concepts: With AI-enabled development, we can quickly prototype and validate concepts. This rapid iteration allows stakeholders to see and interact with the core functionalities early in the process, providing valuable feedback that can be incorporated into the final product. This iterative approach ensures that the end solution is closely aligned with user needs and expectations.
          2. Reduce Development Time and Costs: The efficiency of AI in generating code and automating repetitive tasks reduces the overall development time and associated costs. This cost efficiency is particularly beneficial for custom applications, where budget constraints are often a concern. By minimizing development time, we can deliver high-quality solutions without compromising on functionality or performance.
          3. Enhance Collaboration and Innovation: AI-enabled development tools facilitate better collaboration between development teams and stakeholders. The ability to quickly produce working prototypes encourages more frequent and meaningful interactions, fostering a culture of innovation and continuous improvement. This collaborative environment helps in identifying and addressing potential issues early, ensuring a smoother development process.

          Considerations for Successful Implementation

          While the benefits of these custom integrations are clear, there are several considerations to keep in mind to ensure they are implemented most effectively:

1. API Limits and Contracts: When designing these custom applications, it is essential to be mindful of API limits and contractual obligations with enterprise application providers. Overuse of APIs can lead to additional costs or service disruptions. Therefore, careful planning and monitoring of API usage are crucial to maintain a balance between functionality and cost-efficiency (a simple client-side throttling sketch follows this list).
          2. Security and Compliance: Ensuring that custom applications adhere to the organization’s security policies and compliance requirements is vital. This involves implementing robust authentication mechanisms, data encryption, and regular security audits to protect sensitive information and maintain regulatory compliance.
          3. Performance and Reliability: Custom applications must be designed for performance and reliability to ensure they can handle the expected load and provide consistent service. Regular performance testing and monitoring can help identify and address potential issues before they impact users.
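
As a hedged illustration of the API-limit consideration in item 1, client-side throttling combined with retry and exponential backoff is one common way to stay within provider quotas; the limit values and endpoint usage below are assumptions.

```python
# Illustrative sketch: simple client-side rate limiting plus retry with
# exponential backoff to respect an enterprise API's usage limits.
import time

import requests

MAX_CALLS_PER_MINUTE = 60          # hypothetical contractual limit
MIN_INTERVAL = 60.0 / MAX_CALLS_PER_MINUTE
_last_call = 0.0

def call_api(url: str, payload: dict, max_retries: int = 5) -> requests.Response:
    global _last_call
    for attempt in range(max_retries):
        # Throttle: never exceed the agreed calls-per-minute budget.
        wait = MIN_INTERVAL - (time.monotonic() - _last_call)
        if wait > 0:
            time.sleep(wait)
        _last_call = time.monotonic()

        response = requests.post(url, json=payload, timeout=10)
        if response.status_code != 429:        # 429 = provider says slow down
            response.raise_for_status()
            return response
        time.sleep(2 ** attempt)               # exponential backoff before retrying
    raise RuntimeError("API rate limit still exceeded after retries")
```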

          Conclusion

In an era where application optimization, cost-efficiency, and value maximization are paramount, lightweight custom integrations offer a compelling solution for organizations looking to optimize their use of enterprise applications. By addressing the challenges of high licensing costs and limited user access, these custom solutions enable companies to fully leverage their technological investments, enhance data collection and decision-making, and achieve a higher ROI. This value can be realized quickly and iterated upon by leveraging AI-enabled development and introducing an MVP on a short time frame.

          The goal of these lightweight applications is to help organizations navigate these challenges and unlock the full potential of their enterprise applications through innovative and cost-effective custom solutions. By focusing on tailored functionality, extended access, and careful consideration of API limits and security, you can create a more efficient and productive business environment that drives growth and success.

          Get In Contact With Us

Kenway is excited to partner with you to create the technology landscape that will realize the most value out of your enterprise applications, optimizing your licensing costs and expanding your user base. We would like to offer a complimentary recommended solution along with an ROI analysis to quantify the benefit of pursuing this approach. Contact us today to get started.

          Implementing & Supporting Contact Centers for Single Computer-Telephony Integration (SCTI) 

          A Multi-Faceted Approach to Modernizing Contact Centers for a Leading Fortune 50 Telecom Provider 

          CLIENT PROFILE 

          Industry: Telecommunications 

          Client: Fortune 50 Telecom Provider 

          Solution: Contact Center Solutions, Enterprise Program Management

          BACKGROUND 

          In an increasingly competitive business landscape, contact centers face the constant challenge of optimizing operational efficiency while enhancing customer experience. As businesses expand, whether through acquisitions, new service offerings, or growing vendor reliance, contact center infrastructures tend to fragment, which can significantly affect performance. With the growing need to support omnichannel customer communications—such as voice, chat, and video—the challenge becomes even more pronounced. 

Recognizing these complexities, a Fortune 50 telecom provider sought to transform its contact center platform to drive efficiency and enhance customer experience. Their goal was to migrate more than 30,000 agents spanning over 400 centers and 16 countries from a legacy system onto a modern, unified platform. Kenway Consulting was engaged to lead this initiative, leveraging its expertise in Enterprise Program Management and Contact Center Solutions.

          THE CHALLENGE 

          The client's contact center infrastructure was fragmented, largely due to legacy systems and acquisitions over time. With tens of thousands of agents across multiple lines of business, each with distinct requirements, migrating to a unified platform presented several key challenges: 

Kenway was tasked with leading the migration while navigating these obstacles, implementing a unified system that would streamline operations and enhance the customer journey.

          THE SOLUTION 

Kenway's involvement spanned several years, working closely with the client to tackle the complexities of modernizing its contact center infrastructure by consolidating multiple Computer Telephony Integration (CTI) systems onto a single CTI vendor. By providing program management services, vendor management, and routing architecture and support, Kenway ensured a seamless transition.

          1. Enterprise Program Management 

Kenway implemented a robust Enterprise Program Management approach to ensure alignment across all projects. The team facilitated regular stand-ups and deployed program-wide tools that enhanced communication across 20 project managers and technical SMEs. By doing so, Kenway was able to forecast accurately, keep project teams on track even when supply chain issues with crucial server components delayed the project by six months, and assist with executive communications.

          Key areas of focus included: 

          2. Routing Architecture and Support

          One of the most challenging aspects of the migration was developing a new call flow strategy while minimizing disruption to ongoing operations. 

          3. Testing Services 

          To ensure the infrastructure was robust and functional, Kenway played a key role in establishing and managing comprehensive testing services across various phases of the project. 

          4. Migration and Go-Live Support 

          Kenway's support was critical during the migration and go-live phases. From preparing agents to ensuring a seamless transition, the team handled multiple moving parts, including: 

          5. Customer Experience Enhancements 

While Kenway's primary focus was on the migration, the team also identified additional customer experiences and applications that needed to be migrated onto the central virtual assistant platform. These applications used the timing of the platform change to incorporate enhancements during their migration period so that training could be rolled out holistically. Kenway identified the opportunity to complete these enhancements before the migration go-live date and helped incorporate them into the overall program training curriculum. One such example is the agent desktop application, the tool an agent uses to manage a customer's call. Kenway saw a gap in how enhancement requirements from the business were reaching the third-party vendor for development. Kenway stepped in to host requirements grooming sessions, create a backlog of enhancement requests, and see the work through to production deployment in time for the migration go-live date. By going beyond the migration scope alone, Kenway optimized the agent and customer experience beyond just the platform switch.

          WHAT WE DELIVERED 

          Kenway’s comprehensive approach delivered significant value across multiple areas of the project: 

          RESULTS  

          By leveraging Kenway’s expertise, the client successfully completed the CTI migration while enhancing their contact center operations. The result was a scalable, flexible platform capable of supporting the company’s evolving business needs. Key outcomes included: 

          Kenway’s strategic approach, technical expertise, and commitment to operational excellence were crucial to the successful execution of this complex, multi-year transformation. 

          CONCLUSION 

          The successful implementation and migration of the CTI platform highlights the critical role of strategic collaboration, technical expertise, and effective program management. Kenway facilitated this transformation by coordinating complex vendor landscapes, optimizing call flows, and leading thorough testing efforts. As a result, the client was able to modernize their contact center infrastructure, improve customer experience, and establish a scalable foundation for future expansion. 

          Discover how our customized approach to contact center modernization and enterprise program management can optimize your operations and enhance customer satisfaction. Contact us today to explore how we can help your organization achieve operational excellence. 

          Case Study: A Comprehensive Approach to Data Governance Strategy and Data Management - Assessment, Recommendations, Roadmap, and Implementation

          CLIENT PROFILE

          BACKGROUND

          Enabling data-driven decision-making is a key component of maximizing success in today’s business world, regardless of industry or organization. To be effective, data needs to be complete, accurate, and reliable – a clearly defined Data Governance strategy will ensure that is the case.

          Without an effective Data Governance strategy, there are likely to be insufficient or ineffective data policies and procedures. This can lead to poor data quality and ineffective decision making, as insights into patterns, preferences, issues, root causes, and associations could be incorrect.

          To build a reliable platform, companies need to start with a clearly defined Data Governance strategy, which then becomes the primary driver to enable the implementation of effective Data Management across the enterprise.

          Once implemented, a Data Governance strategy will ensure accurate and trustworthy data, which will then guarantee the tools to support data virtualization and/or visualization are reliable and impactful. This allows companies to begin to focus on more advanced data strategies around artificial intelligence (AI) and Advanced Analytics, which can include machine learning, predictive analytics, statistical modeling, etc.

          THE PROBLEM

The client, a software and support organization for small businesses, is gearing up for an Initial Public Offering (IPO). They faced the challenge of demonstrating robust data governance practices to satisfy regulatory requirements and attract investor confidence, as well as supporting their increased investment in analytical capabilities. Their existing data management policies and decision-making lacked the transparency and accountability necessary for the scrutiny of public markets.

          THE CHALLENGE

          The client was looking to have organized and standardized approaches to Data Management and Data Governance, which was a strategic end goal and a top priority across leadership at the organization. 

However, with an inadequate Data Governance strategy in place, the client was struggling to address prevailing pain points that manifested within reporting, business intelligence, and analytics platforms in the form of conflicting versions of the truth and extensive manual effort. The client was at risk of failing to achieve its strategic vision.

          As a result, there was a strong desire to mature the approach to data to effectively solve current system issues and create a foundation for successfully leveraging data through accelerated development of technologies like data warehousing and business intelligence.

          THE SOLUTION

Acknowledging the need for a transformative approach, Kenway launched a phased engagement centered on performing a Data Governance Maturity assessment to help the client understand where the gaps were, providing a set of recommendations, building a roadmap defining a path to address the underlying issues, and implementing the recommendations defined in that roadmap.

          Data Governance Maturity Assessment

          The client shared a long-term vision which sought to simplify its data experience but was unsure how to go about implementing the changes needed to deliver such an experience. The organization required a current state assessment to surface underlying issues, determine gaps, and understand root causes for the pain points it was experiencing. 

          Assessments are key to understanding existing processes and capabilities. Lack of an assessment can lead to unrecognized gaps and/or missed opportunities to improve certain aspects of Data Governance/Management, ultimately limiting the client’s ability to meet future state goals and align with their vision.

          Kenway’s assessment approach included reviewing existing data processes and interviewing various key employees across the Finance, Accounting, Customer Success (Operations), and Business Intelligence business units. The interviews focused on the following general areas:

          The Kenway team then synthesized the findings from the stakeholder interviews and workshop to document the current state high-level, data-related processes and procedures and the related pain points encountered across the organization.

          The current state was then assessed based on Kenway’s Data Governance assessment criteria:

          Rankings for each of the criteria were aggregated and aligned to the Data Governance Maturity Curve for the organization, which provided the client with insight on their Maturity Level.

          With the assessment completed and the Data Governance Maturity defined, Kenway was able to provide the client with a set of recommendations and a related roadmap to help address the pain points and achieve its strategic goals while increasing its Data Governance Maturity. This led into the second stage of the engagement, the implementation.

Recommendations & Roadmapping

          Based on the findings, Kenway ultimately defined a set of key recommendations for the client:

          A roadmap was then established to define a path toward implementing change that would address the recommendations and deliver meaningful and measured value over time. This is shown below:

          Implementation

Taking the roadmap from the assessment stage, the Kenway team began the Data Governance implementation by establishing, assessing, reviewing, and validating the foundation. This started with securing organizational buy-in and authority. Kenway took a bird's-eye view, examining the organization at an enterprise level to understand who had its best interests in mind and held adequate coverage across the enterprise to enable decision-making. Kenway then facilitated conversations to ensure the groups identified would work well together.

          Once a Data Governance Charter was defined and ideal candidates had been selected for the Steering Committee, Kenway proceeded to create an Operating Model and establish key success metrics. This sequence is captured in the diagram below.

          Identifying and onboarding the right participants across the enterprise, each with decision-making authority, to form the Steering Committee posed a hurdle, as decision-makers have limited capacity to give. Kenway earned their commitment by communicating the importance of Data Governance within the organization and holding numerous conversations with each, guided by a strategic change management plan comprising risk assessments and the training and communication plans needed to support key decision-makers on the committee.

          Once the foundation was in place, the next stage of the initiative was to identify one high-priority problem or opportunity for the organization to tackle. The diagram below outlines the iterative cycle of accelerating, refining, and reaching market success by solving the business problems at hand.

          Throughout the journey, the team faced obstacles in ensuring the Steering Committee could function as a unit and make decisions in a timely manner. To mitigate this challenge, Kenway facilitated the conversations so that participants examined all considerations, confidently represented their areas of the business, and ultimately reached decisions unanimously. Kenway shared industry best practices and expertise from prior implementations at other clients, providing the Steering Committee with unique perspectives on what went right and what went wrong at those firms.

          Kenway also leveraged an in-house decision-making framework to help the Steering Committee make decisions, allowing for open and candid feedback within a structured process. Kenway adapted its own style to align with the framework already familiar to the organization.

          RESULTS

          The definition of a Data Governance Charter outlining roles and responsibilities, along with the formation of a Steering Committee and the implementation of a Data Governance Operating Model, has empowered the client to streamline data-based decisions and gain confidence in its data.

          Implementation and BAU Data Governance activities are now supported by change management practices, including a Training Plan, Communication Plan, and Change Management Strategy.

          The client's systems and policies are now updated and aligned with Data Policies, supported by extensive documentation comprising a Data Lineage Diagram, Data Catalog, Data Classification, and System Flow Diagram. This has enabled enterprise-wide data ownership and accountability.

          Path to Growth:

          In collaboration with Kenway, the client has a comprehensive backlog of opportunities to tackle in subsequent iterations, as well as Data Governance job descriptions and talent-acquisition recommendations for both internal and external hiring.

          The client did an excellent job of marketing its initial successes through an enterprise-level communication plan outlining the progress made, the steps to maturity, and a plan for continued growth.

          CONCLUSION

          The effective implementation and operationalization of a data-driven decision-making framework highlight the importance of strategic collaboration, technical expertise, and adaptive Data Governance and Data Management policies.

          The results of this synergy have allowed the client to realize ROI on its technology investments and fully utilize the tools being implemented.

          Explore how our tailored approach to data governance and management can transform your organization's decision-making capabilities. Contact us today to discover how we can help you achieve data-driven success. 

          Case Study: Crafting a DevOps Strategy for Seamless CCAI Implementation

          Client Profile

          Industry: Telecommunications

          Client: Fortune 50 telecom

          Solution: DevOps Strategy Implementation

          Background

          The client's Contact Center architecture stood as a beacon of sophisticated design, underpinned by a well-established DevOps model that efficiently catered to both Voice and Digital channels. This intricate framework was the product of years of evolution, fine-tuning, and a deep understanding of the operational dynamics necessary to deliver exceptional customer service. The DevOps model in place facilitated a seamless flow between development and operations, ensuring that any updates, bug fixes, or new features could be rapidly deployed with minimal disruption to service.

          The Challenges

          The decision to embrace Google Contact Center AI (CCAI) ushered in a new era of innovative solutions, accompanied by a myriad of challenges:

          The client team needed to integrate applications built on an external CCAI platform without causing disruptions. The team had to ensure that the introduction of CCAI would not upset the finely tuned balance between development and operations within their existing DevOps model, and they had to demonstrate strong strategic foresight in navigating the intricacies of cutting-edge technology.

          In addition, they needed to seamlessly weave innovative CCAI capabilities into existing people, processes, and tools. The introduction of CCAI was not merely a technical task but a cultural one, requiring a shift that entailed recalibrating pipelines, redefining roles, and restructuring parts of the organization while preserving the balance between development and operations in their DevOps model.

          Beyond technical integration, the team grappled with aligning aspects of their customer service philosophy with the transformative potential of CCAI. This necessitated more than preserving existing standards; it demanded an evolution that augmented their service offerings. 

          The client quickly recognized the profound implications of this shift on customer service, operational efficiency, and their competitive edge in the market. It was not merely about adopting new tools; it entailed reimagining the entire customer experience landscape. 

          By embracing CCAI technology and adapting workflows to align operations with the new technology requirements, the client positioned itself as both an innovator and a steward of customer satisfaction and industry leadership. This strategic foresight paved the way for a seamless transition that not only met but exceeded expectations, ensuring they remained at the forefront of customer service excellence in an ever-evolving digital landscape.

          The Solution

          Through dynamic workshops and facilitated strategic discussions, Kenway united the client and Google teams to establish a refined, effective DevOps strategy tailored to the CCAI implementation.

          The Kenway team leveraged its Contact Center expertise, understanding of the client, and foundational knowledge of efficient DevOps processes to develop and implement a DevOps model that proved crucial to deploying CCAI successfully on a regular release cadence.

          As part of its framework, Kenway hosted collaborative sessions that provided a platform for mutual learning and exchange: Google gained insights into the client's methodologies, while the client received an in-depth understanding of CCAI strategies. Kenway skillfully adapted existing practices, defining a healthy, effective DevOps framework that included pipelines and workflows across all impacted personas.

          The resulting framework facilitated the efficient rollout of weekly CCAI updates into production, which was critical to the program's success, without interrupting service or disrupting established BAU processes.
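          To make this release cadence concrete, the following sketch (Python) models the kind of promotion gate a weekly pipeline might enforce before pushing a CCAI change to production. The checks, names, and version label are assumptions made for illustration; they are not the client's actual pipeline or a Google CCAI API.

              # Hypothetical release-gate sketch for a weekly CCAI rollout.
              # Checks and names are illustrative assumptions, not the client's pipeline.
              from dataclasses import dataclass, field

              @dataclass
              class ReleaseCandidate:
                  version: str
                  regression_tests_passed: bool = False
                  conversation_flows_reviewed: bool = False
                  rollback_plan_documented: bool = False
                  failures: list[str] = field(default_factory=list)

              def ready_for_production(rc: ReleaseCandidate) -> bool:
                  """Run the promotion checks and record any failures on the candidate."""
                  checks = {
                      "regression tests": rc.regression_tests_passed,
                      "conversation flow review": rc.conversation_flows_reviewed,
                      "rollback plan": rc.rollback_plan_documented,
                  }
                  rc.failures = [name for name, passed in checks.items() if not passed]
                  return not rc.failures

              # Example: evaluate a weekly release candidate (hypothetical values).
              rc = ReleaseCandidate(version="weekly-2024-07", regression_tests_passed=True,
                                    conversation_flows_reviewed=True, rollback_plan_documented=True)
              if ready_for_production(rc):
                  print(f"Promoting {rc.version} to production")
              else:
                  print(f"Blocking {rc.version}: {', '.join(rc.failures)}")

          In a real implementation, gates like these would live inside the team's existing CI/CD tooling so the weekly cadence fits established BAU processes rather than working around them.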

          The Results

          The transition to CCAI has positioned the client to better meet customer needs and stay ahead in a competitive AI market. By leveraging the capabilities of CCAI, the client is now better equipped to handle a wide array of customer interactions, improving overall satisfaction and engagement. The client’s ability to seamlessly integrate cutting-edge technology while maintaining its core service ethos has set a new standard for the industry, showcasing the potential of leveraging AI to enhance customer experience and operational capabilities.

          The implementation of a robust DevOps strategy tailored for CCAI has not only streamlined operational processes but also paved the way for future innovations by supporting the ability to deliver consistent quality, quickly, at scale. 

          Conclusion

          The successful integration of Google Contact Center AI into the existing architecture underlines the importance of strategic partnership, technical expertise, and adaptive DevOps practices. The results of this collaboration have demonstrated significant advancements in operational efficiency, customer service, and strategic innovation. 

          Ready to revolutionize your customer interactions with a refined DevOps strategy? Contact us to unlock the full potential of CCAI implementation!