Power BI vs. Tableau vs. Qlik: Which BI Tool is Best?

“Just give me the data.” When Kenway Consulting engages in Business Intelligence (BI) projects, many of them begin with that simple phrase— “Just give me the data.” Organizations want their data from various source systems in the hands of their power users. Doing so allows them to leverage the industry expertise and analytical mindsets for which they hired these resources. To maximize our value during a BI project, we believe in getting our clients the data that addresses their highest-impact business questions early in the data discovery phase and then iteratively developing it in an in-memory data visualization tool.

We use in-memory analytics and data visualization tools because they allow:

However, just as no two clients’ needs are the same, we have learned that we cannot simply pick one tool to address every engagement. In an effort to best serve our clients, Kenway recently undertook a hands-on research project to vet Power BI vs Tableau vs Qlik.

Power BI vs. Tableau vs. Qlik: Our Research

Here is how it worked. We built a report in Qlik Sense and used it as a benchmark against two major competitors, Microsoft Power BI and Tableau. We reviewed the products on their ability to fulfill a few of the common use cases we have seen with our clients:

Before we begin our intergalactic adventure in data, here is some background on the exercise:

So let’s compare Power BI vs Tableau vs Qlik!

Data Extraction

Directly importing our data files was quite easy in all three tools. Each had a user-friendly data loading wizard that allowed us to quickly find files on our hard drive, make some minor manipulations, and incorporate them into our application.

The most striking difference was the number of data sources available via the versions we used. Power BI Desktop led the way in this category—out of the box, it allows users to utilize the wizard to extract from various file structures, databases, online services, and other applications. Qlik Sense also allows for a large spectrum of data sources to be incorporated; however, it requires a bit more technical savvy and/or searching to do so. Tableau Public limits users to local files, OData connections, and the Azure Marketplace DataMarket. However, if you choose to upgrade to the Professional version, you get the same breadth of sources and out-of-the-box connectivity as Power BI.

Outside of using the data loading wizards, Qlik Sense and Power BI provided much more robust scripting languages than Tableau. Qlik Sense’s data load editing language resembles SQL, a language familiar to many people with database experience. Power BI utilizes Power Query (the M language), which is similar to F#, a functional-first language. Tableau’s data loader allows users to make minor transformations to a loaded dataset (adding calculated values, grouping values, defining joins between tables, etc.); however, its lack of a coding language limits the number of tasks you can accomplish. For most use cases, the data will have to be prepared at the source level (e.g. modifying the files, creating views and/or tables in the desired model, etc.).

Once the data was loaded into the applications, Qlik Sense differentiated itself from the other two products through the final data model it is able to utilize. Qlik’s associative data model allows Qlik Sense to string together connections between every table in the data model. This allows users to develop unique analyses across seemingly disparate data tables. While Tableau and Power BI are also able to bring multiple data sets and data sources into their models, as users add varying layers of complexity, they must be more cognizant of the impacts on the data model.

For more information around each application’s connectivity, scripting, data load times, data compression abilities, and data modeling strengths and weaknesses, please see our full Data Wars Whitepaper.

Data Loading Breakdown

Executive Dashboard

Not surprisingly, all three of the tools were able to address our baseline reporting case—the Executive Dashboard.

As you can see, each tool was able to make a polished, user-friendly dashboard. Users are able to make line charts, scatter plots, and bar charts easily and can enhance them by adding filters. Furthermore, each of them supports a community of custom-developed add-ons. The one we used here is by our friends at Tableau Augmented Analytics, formerly Narrative Science (denoted by their logo). They have developed an add-on for Qlik Sense, Power BI, and Tableau that creates text summaries of your visualizations.

When it comes to Power BI vs Tableau vs Qlik from a default visualization standpoint, Tableau and Power BI came with more visualization types than Qlik Sense. While utilizing Qlik’s marketplace and customizing its standard visualizations allows Qlik Sense to make up some ground, this could be overly burdensome for less technical audiences.

Ultimately, we give a slight edge to Tableau in the visualization creation and organization space—the application’s interface has users create objects in separate tabs and then consolidate them into a single dashboard using a drag and drop design.

Tableau and Power BI also have an advantage when it comes to data manipulation on the visualization layer. They provide the user with wizards on the visualization layer to group fields, create hierarchies within fields, apply rules to fields, and create auto-filters for fields. The uses for these can range from making calendar fields (month, quarter, year, etc.) to developing drill down logic.

If users are embarking upon data discovery exercises, Qlik Sense’s white-green-gray filter functionality differentiates it from the other two. The white-green-gray color pattern indicates whether a field value is included in the current set, directly chosen for the current set, or excluded from the current set, respectively. This is useful in highlighting items like missed opportunities.

For further details around how the tools recognize field types (dates, locations, etc.), allow for heat map creation, enable users to build custom fields, and facilitate data discovery, please read our Data Wars Whitepaper.

Executive Dashboard Development

Customer Segmentation

With the basic use-cases covered, we wanted to see which tool handled some of our more complex business needs. The first we looked into was customer segmentation. Many of our clients look to group their customers based on dynamic, automatically updated business rules. As this dataset was sales data, we decided to group customers using the following:

Impressively, all of the tools were able to accomplish this segmentation. We used Qlik Sense’s and Power BI’s aforementioned scripting languages to develop these into the data model. For Tableau, we were able to string together multiple custom fields in the visualization layer to develop the needed segmentations.
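To make the Power BI side concrete, here is a minimal sketch of such a segmentation as a DAX calculated column. The table, field, and tier names are hypothetical illustrations, not the actual business rules from our exercise:

    -- Hypothetical calculated column on a Customers table related to a Sales table.
    -- The CALCULATE wrapper triggers context transition, totaling this customer's sales.
    Customer Segment =
    VAR TotalSales = CALCULATE ( SUM ( Sales[Amount] ) )
    RETURN
        SWITCH (
            TRUE (),
            TotalSales >= 100000, "Key Account",
            TotalSales >= 10000, "Growth Account",
            "Emerging Account"
        )

Because the column is recalculated on every data refresh, the grouping stays dynamic as new sales arrive.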

Flows

Another key transformation in which our clients have found value is flows. These are used in customer service routing, order fulfillment, customer purchase pattern analysis, and other examples. Because of the ability to create custom scripts in Qlik Sense, we were able to recreate the logic for these. While we were unable to accomplish this with Power BI, we believe it could have been re-created with more time. Tableau would require the data to be prepared outside of the tool, likely in the source system.

For more information around how customer segmentations and flows were incorporated into the tools, please see our full Data Wars Whitepaper.

Summary

Here’s what we learned comparing Power BI vs Tableau vs Qlik:

We hope you enjoyed our journey; we certainly learned a lot and got to geek out a little. Stay tuned for more information as these tools evolve and shift and new tools are added to the in-memory analytics ecosystem. Maybe we'll compare Power BI vs Tableau vs Qlik again. Or perhaps a new data visualization tool with in-memory analytics will arise.

Want to learn more about Kenway’s experience with Business Intelligence and data visualization tools? Drop us a line at [email protected].

 

What-If? Understanding Uncertainty with Power BI

With the right tools, incorporating What-If analysis into existing reporting is quite straightforward. Modeling capabilities can be married with dynamic visuals, giving the end user even more power and flexibility when viewing their organization’s data. Tools such as Power BI, Microsoft Excel or Google Sheets can help your team not only answer the question of “What happened?” but also answer the question “What could happen?”

In the ever-changing environment that businesses are being pushed to operate in, having these answers is crucial. Read on to learn more about how you can begin to understand uncertainty with Power BI.

WHAT IS POWER BI? 

Power BI is Microsoft’s offering in the crowded business intelligence space. While many of its competitors are well established (Tableau, Qlik, etc.), Power BI holds its own in terms of capabilities, and has the added benefit of being fully integrated within Microsoft’s platform. In Gartner’s 2022 rankings of analytics and business intelligence platforms, Microsoft again placed furthest along in Completeness of Vision and highest in Ability to Execute within the Leaders quadrant.

Gartner 2022 Magic Quadrant for Analytics and Business Intelligence Platforms

As described by Gartner, the chief appeal of Power BI is its ubiquity. Gartner states, “many large organizations already own Power BI through enterprise software agreements,” and the familiarity many users have with other Microsoft products (e.g., Excel) leads to a short learning curve with Power BI.

When it comes to core functionality, Power BI is on par with the leaders. While its visuals might not be as polished as Tableau’s, they are intuitive and appealing. Similarly, while its data transformation capability might not be as robust as Qlik Sense’s, it has an ETL capability and can therefore deliver on the requirements of the vast majority of analytics projects.

HOW CAN WHAT-IF PARAMETERS BE USED FOR REVENUE FORECASTING? 

At Kenway, we have found What-If parameters immensely helpful in the forecasting process. Having built out our internal reporting suite in dynamic Power BI dashboards, we can rapidly pivot between best-case and worst-case scenarios when planning for the next 6-12 months.

For instance, our revenue forecasting is built bottom-up from our internal data. This data is based on deals we have already won and entered into our ERP system. While this is very accurate for the near term, forecasting further into the future can be difficult as a larger percentage of those deals haven’t been won yet and therefore do not appear in the ERP system. To better understand this component, we can layer in data from our CRM pipeline, but this brings uncertainty with it since we must estimate how likely a project is to be won.

Read more about how we’ve helped our clients achieve success with Power BI in this case study.

HOW TO LEVERAGE THE WHAT-IF ANALYSIS

Enter the What-If parameter. Though we have a good idea of how many projects we might win based on historical trends, we can use What-If parameters to explore how our revenue forecasts change with different future project mixes.

For even more flexibility, we can use multiple What-If parameters at once. In this example, number of projects, hours per project, and rate per project are all configurable parameters. The product of these three values is what gets added to the revenue forecast but, because of the Power BI functionality, there’s no limit to how intricate you can get with the calculations. In the below video, see how changing all three of these values alters the overall revenue forecast.
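As a rough sketch of the DAX behind this pattern (measure names are illustrative placeholders; each What-If parameter exposes a “... Value” measure, as described in the how-to section below):

    -- Hypothetical measures; each What-If parameter contributes a "... Value" measure
    Forecasted New Revenue =
        [Projects Value] * [Hours per Project Value] * [Rate per Project Value]

    -- Layer the modeled amount on top of booked revenue from the ERP data
    Total Revenue Forecast = [Booked Revenue] + [Forecasted New Revenue]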

In the video below, see how a project planner can swap resources, hours per resource, and project bill rate to rapidly fine tune the staffing mix for a project.

This example shows how an organization can utilize What-If parameters to forecast at the macro level, and it’s not hard to imagine how the same concept could apply at a more micro level on an individual project basis. A common challenge Kenway faces as a consulting firm is optimally staffing our projects. When a project is won, we generally know how many hours it will take and what bill rate the client will pay. We must then decide how to staff the project, so the work is completed on time and, if possible, under budget. By setting up What-If parameters based on hours each resource on a project works, we can combine our internal data with these results to fine tune the best staffing plan to move forward.

VALUE OF WHAT-IF ANALYSIS AND MODELING

The advantages of moving away from static reports toward business intelligence tools like Power BI are extensively documented elsewhere. What many organizations are seeking is action: how do they take the next step and get more insight out of the reports and dashboards that these applications provide to drive actual change?

What-If analysis can deliver exactly that. Often, even organizations with mature data visualization capabilities keep their decision modeling separate from their dynamic reporting. They tend to use tools like Power BI to create interactive reporting that displays data trends that are easy to filter and drill into, but to model out different potential scenarios they still resort to classic modeling in Excel. 

There are many tools capable of creating these What-If scenarios, including tools you may already have, such as Microsoft Excel and Google Sheets. Below, we’ll go into detail on how to use one particular tool, Power BI, to create What-If scenarios.

HOW TO CREATE A WHAT-IF SLICER IN POWER BI

So how exactly does a What-If analysis work in Power BI? Fortunately, it’s almost as simple as clicking a button. Specifically, the “New parameter” button on the Modeling tab, helpfully labeled with “What if.”

1. Leverage the “New parameter” tool in Power BI:

 

2. Specify Attributes of the Parameter:

Clicking this button brings up a window where you can specify the attributes of the parameter. Along with a name, you can provide the data range of available options to select. Choosing this data range is the key to how the parameter works. The end user will still view the parameter as a truly open-ended What-If question, but Power BI will treat their selection as if they simply filtered the underlying mini dataset of values. Indeed, the input window even has a checkbox that lets you add a slicer to the page upon creating the What-If parameter. This lets the user choose the parameter’s value using Power BI’s traditional slicer visual.

With this slicer, the user can either enter a value in the free response box or drag the slider to incrementally change the parameter’s value. The increment you specify in the setup window will determine the sensitivity of the slider, so if you want users to choose from a wide range of values, use a larger increment.

3. Create Calculations

For the developer, the selected parameter value can then be incorporated into calculations so that metrics change dynamically along with the user’s selection. As part of the parameter creation, Power BI automatically creates a table with the data range of values along with a measure that returns the user’s selected value (if you’re familiar with DAX, it does this by leveraging DAX’s “SELECTEDVALUE” function).
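For a hypothetical parameter named “Projects Won,” ranging from 0 to 50 in increments of 1, the auto-generated objects look roughly like this (the names and fallback default are assumptions for illustration):

    -- Calculated table holding the parameter's range of values
    Projects Won = GENERATESERIES ( 0, 50, 1 )

    -- Measure returning the user's slicer selection (falling back to 10 if nothing is selected)
    Projects Won Value = SELECTEDVALUE ( 'Projects Won'[Projects Won], 10 )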

Since it’s a measure, it can be easily added into any existing DAX calculation in whatever way makes the most sense for that parameter. For example, if the parameter represents a percent change then it might multiply an existing calculation. If it represents a nominal change, it might just be added to an existing measure.
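For instance, with hypothetical measure names, the two cases might be sketched as:

    -- A percent-change parameter multiplies an existing measure
    Adjusted Revenue (Pct) = [Total Revenue] * ( 1 + [Pct Change Value] )

    -- A nominal-change parameter is simply added on
    Adjusted Revenue (Nominal) = [Total Revenue] + [Nominal Change Value]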

4. Continuing to Optimize 

Leveraging the actions in steps 1-3 will allow you to reach insights more quickly through custom parameters that fit your company’s unique needs. Continuing to optimize from here to ensure your parameters stay up to date will help your team realize continued success.

 

HOW KENWAY CAN HELP

Have you been asking more of your data? Do you ever look at reports or dashboards and wonder, “What if….?” If so, Kenway can help. If you want to know how it would work in your organization, read more here or reach out to us today. 

 

How to Make the Most of Data Visualization

The human brain can process entire images that the eye sees in as little as 13 milliseconds.

Regardless of whether you are the CEO, technology director, or compliance officer within your organization, information in the form of graphs and charts is not only easier to digest, but also promotes data-driven decision-making.

But there is a problem: Most organizations generate massive amounts of organizational data every day that is left unused, taking up storage in its rawest form. Consequently, teams are unaware of the data available to them, and if they are aware of it, they are unsure how to access or interpret it. Scattered and disorganized data requires hours of manual consolidation, cleansing, and validation, and the output is ultimately prone to manual errors. 

If your teams are inundated with spreadsheets and spending an inordinate amount of time gathering, cleansing, and reconciling disparate data sources manually rather than providing value-add analysis, then it might be time for a change.

If your organization’s leadership is not leveraging organizational data as a valuable asset to drive proactive risk mitigation and decision-making, then the real question is, “How much am I spending by NOT investing in my data?” 

The cold, hard truth is that organizations can no longer afford to rely on spreadsheets and dirty data to make business decisions; however, data visualization can help to automate the consolidation and aggregation of data, equipping teams with the power to quickly interpret information to drive business results and increase overall team efficiency and satisfaction.

What is Data Visualization?

Before we cover how visualizing your data can help your organization, you may be wondering, what is data visualization? 

Data visualization is the transformation of unstructured or raw data into a visual form to communicate complex data relationships and data-driven insights in a way that is captivating and easy to understand. By succinctly summarizing copious amounts of organizational information into visually appealing reports, teams do not have to dissect and analyze underlying data to understand trends over time.

Data visualization bridges the gap between data and action by providing access to real-time metrics, allowing businesses to be better positioned when it comes to: 

In addition, data visualizations provide leaders the opportunity to harness existing data and leverage it to learn from past mistakes, build on past successes, and anticipate developments that drive innovation and accurately predict future outcomes. 

Key Benefits of Data Visualization

The average organization collects data across 400 different sources. However, about 47% of that data goes completely unused because it is disorganized, unstructured, and dirty, which can cost your organization countless hours and dollars.

In order to fully realize the value that your data can offer, good data visualization is imperative. However, investing in data visualization tools and technologies without organizational buy-in or foundational data practices can actually prevent companies from maximizing ROI in the long run. In order to ensure sustainable value realization, you must first establish core data governance practices, clean up data sources, and determine the data needs of your organization.

Once these foundational data practices are in place, data visualization can deliver the following key benefits:

1. Increased Comprehensibility of Data and the Breakdown of Data Silos

Because visual data is processed much faster by the human brain, presenting data in an easily consumable format has the incredible ability to streamline organizational production. In contrast to text, which has historically been the preferred medium for exchanging information, humans can process visual images 60,000x faster. Furthermore, data visualization provides a much more interactive approach to displaying data, allowing users to quickly understand the story the data is telling without needing words to provide context. Presenting data to your executives or teams in a visual manner allows for far fewer gaps in communication throughout the enterprise, which can ultimately shorten business meetings by 24% and give you and your teams more time for other value-add initiatives.

Additionally, data visualization can break down data silos within your organization and reduce the amount of time spent on manual reporting. Sixty percent of employees believe they could save six or more hours if static reporting was automated. Business Intelligence tools bridge the gap between siloed data and reporting by utilizing centralized data to display accessible visual reports. Ultimately, implementing a centralized Business Intelligence solution can help prevent wasted efforts on non-value-add activities, while also acting as a catalyst for cross-functional collaboration. 

2. Save Costs & Drive ROI

How can visualizing your data really drive a return on investment? The answer to this question is unique to every organization and depends on the problem you are trying to solve; however, the competitive advantages to investing in Business Intelligence are as follows:

In addition to understanding the tangible benefits of implementing a Business Intelligence solution, it is equally important to note the true costs of not having one. How much will a lack of visibility, process inefficiencies, employee unproductivity, and outdated IT enhancements cost your organization over time? Some potential costs to consider include:

While ROI looks different for every organization, statistics show that data visualization offers an average of $13.01 ROI on every dollar spent. Business Intelligence tools make your data centralized and easily accessible, so employees spend more time on business functions rather than compounding the problem large amounts of data can present.

Data Visualization Best Practices

The quantitative and qualitative benefits of implementing Business Intelligence tools are endless; however, to fully capitalize on your investment in data visualization, you will want to consider these five data visualization best practices:

1. Identify Your Most Critical Data

The first best practice is to establish a core set of data that is most relevant to the entire enterprise. By first defining the business impact you are striving to achieve from implementing data visualization, you can then identify your most critical data elements. Are you hoping to:

Once you identify your most critical data elements, you can begin to strategically and actively reduce the volume of data you do not need and acquire new data elements to paint a holistic picture of your organization.

2. Establish Data Governance

Establishing data governance to aggregate and organize data by effectively managing data definitions and values is imperative to lay the foundation for sustainable value realization from an investment in data visualization. A few key first steps include:

But these steps are only the beginning. Keep in mind that data management requires ongoing evaluation of data quality to best promote accurate reporting.

3. Implement a Centralized Data Model 

In order to blend data sources into cohesive visualizations, it is best practice to create a centralized data repository. Whether the aggregation occurs within the reporting tool itself or in a reporting database, it is imperative to blend data sources to provide cross-functional reporting. Benefits of having a centralized data model include:

4. Create a Data-Driven Culture

A key consideration of any digital transformation is to ensure employees within the organization embrace the new technology. In order to increase adoption and combat any resistance to change, it is essential to develop a cultural framework that motivates your employees to leverage the Business Intelligence tools available to them. Of course, this is easier said than done. Today, only 24% of companies report having a truly data-driven culture. A few challenges to overcome include:

In spite of these challenges, a data-driven culture is possible to achieve. You set yourself up for success when:

5. Know the Audience

Another data visualization best practice is to know your audience. When designing reports, it is important to understand who the intended audience of the report is and what information the end-user needs. For example, executive-level audiences will require a different level of granularity than employees completing day-to-day tasks. A few dashboards to consider for varying data visualization across different audiences include the following:

Rather than simply transplanting information that previously lived in PowerPoint into a Business Intelligence tool, you can fully harness the power of data visualization by asking questions such as:

From there, you can design the right reports by leveraging data to surface actionable insights and improve business performance.

Success Stories: Implementing Data Visualization Into Your Organization

While there is no one-size-fits-all solution when it comes to visual analytics, at Kenway Consulting, our expertise and obsession with all things data have helped us paint the picture of transformative business opportunities for organizations just like yours. Here are a few examples of how Kenway has used data visualization to help organizations ranging from small businesses to large, global enterprises.

Gain Insights From Untapped Data

A mobile application company developed an app focused on virtual engagement to provide cultural institutions with enhanced experiences for their visitors. The application collected large amounts of data from its users but had no way to make that data insightful for clients. The organization was looking for an analytics platform that could:

Kenway developed a fully automated, end-to-end process to support the visualizations needed to help the client’s customers understand the value of their data and make informed decisions. The reports created provided insight into who was visiting their institutions, where visitors were spending most of their time, the most-visited areas of the property, and more. 

Read the full case study here.

Establish a Single Source of Truth 

A leading asset management firm had the goal of harnessing massive amounts of data to become more strategic and intentional in targeting its wealth advisor clients. However, due to numerous data inefficiencies and process gaps across the organization, it struggled to support its sales teams in understanding the full breadth of their relationships with current and potential clients. The company lacked real clarity around advisor profiles such as:

The main problems faced by the organization included:

      1. Siloed data sources
      2. Disconnects in organizational communication
      3. Slow and ineffective processes

Kenway collaborated with the business to understand its needs, analyze the current state, and work with technology teams to build a design that would deliver results. To ensure the organization was set up for success and continued growth, we took the unique approach of working together with the asset management company, as opposed to helicoptering in and leaving them with a design and recommendations that were not tailored to their needs. 

This partnership also allowed the asset management company to cultivate institutional knowledge and build in-house capabilities and data visuals needed to support and adapt the modern data platform over time. This focus on enabling critical business outcomes built upon a solid baseline of governance and architectural capabilities helped to ensure sustainability and long-term success.

Read the full case study here.

Visualize Forecasting Data

When it comes to your business, it is better to be proactive rather than reactive. While we cannot predict the future, business forecasting can help you prepare for potential outcomes. Data visualization can be especially helpful in the development of forecasting charts. 

Forecasting charts analyze your descriptive analytics (historical data) over a specific period of time and provide predictive analytics, or trend lines, that extend past the current date to help you predict future business outcomes. Predictive forecasting can be beneficial when trying to:

At Kenway, we offer Business Intelligence solutions like Power BI to bring your business-critical insights to life through customized reports and dashboards. Using capabilities such as What-If analysis, your organization can plan for best-case and worst-case scenarios over the next 6-12 months.

By leveraging historical data as a proxy for future Inventory Aging, we created a dashboard that forecasts inventory aging for a retailer client to predict the concentration of aged inventory for future risk mitigation.
Using historical revenue trends as a proxy for future revenue predictions, we created a dashboard that allows the finance team to gain insight into future company performance.

Drive Revenue by Increasing Timeliness and Accessibility of Customer Data

Data visualization can help your organization have better insights from customer data to quickly identify and capitalize on new market opportunities.

In order to identify new customers in new markets, you first need to have a strong understanding of your current customer base. Aggregating and cleansing customer data that is spread over a range of disparate sources, such as sales, accounting, and marketing, can be extremely time-consuming and near impossible through conventional methods and Excel spreadsheets. Even if you manage to combine various data sources, surfacing meaningful insights based on criteria such as product line, region, demographic, or sales territory can prove to be even more difficult.

Blending disparate customer data within a Business Intelligence tool allows you to create standardized KPIs, metrics, and visuals to better analyze the characteristics of your current customer base in real-time and become more intentional and strategic with your go-to-market strategy. 

Kenway has vast experience in leveraging analytics and data visualization to reveal a 360-degree view of your customers. To make this information even more powerful, Kenway can also blend external and internal datasets to present a macroeconomic view of your internal data trends.

We leveraged CRM data to forecast future opportunities based on the expected probability that current sales targets will materialize.
Leveraging CRM data, we created a network map to show the prospecting synergies across the sales organization for more intelligent targeting.

No matter how savvy your sales organization and business leaders are, their innate ability to identify new opportunities is no match for a tool that can quickly analyze and consolidate terabytes of data.

Selecting the Best Data Visualizations For Your Organization

There is no denying it: Enterprise data collection is not slowing down. In fact, over the next two years, it is expected to increase at a 42.2% annual growth rate. As the volume and complexity of data caches continue to proliferate, Business Intelligence and data visualization tools will enable your entire organization to consume the information being collected and make proactive business decisions.

Not sure how to navigate the future of your data? Kenway can help. From surveys and polls to decision support tools for the C-suite, our Power Business Intelligence portfolio highlights how our Business Intelligence engagements have helped transform data into consumable, interactive dashboards and reports that drive business-impacting decisions. Request a free data strategy consultation today.

 

Information Insight – Helping to Tell Your Company’s Story

As my colleague, Kevin, recently discussed, we have changed our What at Kenway. That is, we have rebranded our Capabilities and Services in order to provide clarity into the areas where we provide the most value to our clients.

With this change came the need to update some of our messaging — updated elevator speeches, realignment of project summaries, and new ways to describe what we do on a day-to-day basis as a Kenway Consultant.

Starting with our elevator speeches, I worked with some of my teammates to begin updating the ways we informed clients and prospects about Kenway’s value proposition. Unfortunately, we got stuck. We did all the things you would assume to be valuable: we reviewed the organizational changes, we looked at introductory messages that other companies presented, and we tried switching out the Capabilities and Services in our old messaging with the new ones. None of it worked. It all felt over-rehearsed and inauthentic.

At this point, one of my teammates (probably out of frustration), asked aloud, “What is the story we are trying to tell here?” This completely reframed the conversation. Instead of trying to simply describe our new Capabilities and Services, we began discussing the journey on which we wanted to take the listener, what we wanted them to take away from the interaction.

Some of this is reflected in Kevin’s article when he articulates that:

What we learned from this elevator speech exercise was that the important pieces to telling a compelling story were:

These learnings also translate well into how Kenway approaches our Information Insight Capability. More importantly, they allow us to guard against some key missteps such as:

At Kenway, Information Insight entails our Data Management, Data Governance, and Business Intelligence (BI) & Analytics Services. Similar to how people shape an effective story, Data Management helps you collect and index data; Data Governance ensures that the right context in terms of data origination and refinement is used; and BI & Analytics ensures that your data is presented in such a way that it provides actionable information.

Furthermore, we take an iterative, collaborative, and cross-functional approach to our engagements. This helps to avoid the missteps mentioned above. Through iteratively addressing high-impact issues or opportunities, we focus our efforts on key projects and avoid situations where teams gather data for the sake of gathering data — often expending high effort for small problems. We augment this approach with frequent client interaction to ensure that our teams are answering the right questions and optimally focusing their time to deliver value.

Finally, Kenway ensures that, regardless of the core Service driving a project, we bring the right skills, at the right time, in the right volume, for the right duration. This allows us to provide more holistic solutions to meet our clients’ needs. For example, a BI & Analytics project will not only benefit from our ability to model data and create data visualization applications, but it will also leverage Kenway’s knowledge to integrate external data sources through Data Management, and understand the quality and definitions of the data through Data Governance.

By combining our knowledge and technical skills around our Services with an approach that ensures alignment to our clients’ goals, Kenway helps companies tell their stories. Working together, we help clients understand past performance, view what is driving the business today, and gain insight into what opportunities and risks could arise in the future.

We would love to help your organization tell its story! Send us a note at [email protected] or check out the Information Insight page of our website HERE to learn more.

 

Lack of Insight = Increased Frustration

When work becomes frustrating, it can often become all-consuming. We’ve all been there. Whether it’s a tight deadline, a difficult client, or a complex deliverable that isn’t going right, the frustration can take over and often leads to decreased productivity. A few years ago, that was the situation in my household.

My husband is in medical sales. Like most jobs, he is dependent on data to do his job effectively. At the time, his company did all their sales reporting in Excel. I could often hear him swearing at his computer, because the massive Excel document he needed to access would not open. When it did open, it was extremely slow to navigate, the data was static and filled with errors, and it provided him with very little insight into what was happening in his territory and where he had opportunity for growth. This lack of insight was causing him great frustration.

For those of you familiar with Kenway’s “Why” of “To Help and Be Helped,” you know that we are wired to jump in and solve problems. This scenario was no different. One night after dinner, I sat down with my husband, connected his massive Excel spreadsheet to my personal favorite data analytics platform, Qlik Sense, and built him some dashboards. Suddenly, he had the insight he was craving, and his frustration quickly turned to excitement.

He shared the prototype with his colleagues, his bosses, and his bosses’ boss. Eventually, my husband’s company asked Kenway to help them build a true reporting solution. Kenway used its iterative approach to Information Insight to help their Sales Organization do the following:

Define Requirements

Kenway spent time with the Sales Organization helping them understand what data was available to them, identify what types of information would provide them insight, and document a set of clear business requirements.

Data Cleansing

The Sales Organization received sales tracing from a variety of distributors of medical supplies. There was little to no consistency in how the distributors formatted the names and addresses of customers, making it nearly impossible to identify a “true customer” and their associated sales. Kenway helped develop a master data management solution to clean the data and create a mapping of “true customers” for the Sales Organization.

Business Intelligence

Using the defined business requirements and the newly cleansed data, Kenway built sales dashboards in an interactive business intelligence application using Qlik Sense to show a holistic view of sales, trends and opportunities.

Predictive Analytics

Kenway leveraged an external data set to define sales potential based on number of surgeries at hospitals across the country. This data was used to predict how much of each product a hospital would buy, if the hospital was buying solely from this Sales Organization. That data was married up with actual sales to help set a strategy and identify opportunities for growth.

After successfully using Qlik Sense for the past year, the Sales Organization was told that their entire company was being forced to migrate to Power BI. Not surprisingly, this concept of change made them uneasy. The level of insight the Qlik Sense application provided was extremely valuable and had led to increased sales. Would Power BI provide the same?

As discussed in our “Data Wars” blog post, there are plenty of business intelligence tools from which to choose, and they are not one size fits all. The key to success is understanding the business requirements and ensuring the data is well formed.

Because Kenway and this Sales Organization had taken the appropriate amount of time to document the business requirements and ensure the data was well formed, it was a fairly easy exercise to determine whether a migration from Qlik Sense to Power BI would meet all requirements. With proper change management, the team was able to smoothly migrate to Power BI and continue to get valuable insight from their new business intelligence tool.

Do you need help unleashing the power of your sales data, or determining which application is best to do it? Kenway can use its Information Insight expertise to help. Reach out to us at [email protected] to learn more.

 

The Cost of Dirty Data

One of Kenway’s core capabilities is Information Insight. Using a combination of our Data Governance, Data Management and Business Intelligence services, we try to help our clients better leverage their data as an asset. We group these three services together, instead of presenting them separately, because we believe you must have all three to create a robust data environment, which in turn increases our clients’ return on investment (ROI) on their data initiatives. Let’s take a step back and define some terms.

ROI: Regarding Information Insight engagements, our clients’ primary investment is the effort and time it takes for Kenway to collect, clean and present data according to clients’ requirements, along with any technology expenditures (e.g. environments, software, etc.). The return on this investment generally comprises the speed, quality and value of the environments, visualizations and other deliverables that enable a client to make faster and more accurate decisions.

Robust Data Environment: A data environment is considered robust when it equally addresses elements of data governance, data management and business intelligence. This provides an organization the ability to understand:

As you can see below, a key aspect of a robust data environment is balance. Missing aspects of any of these disciplines hampers the value of your environment as a whole.

 

As a budding data analysis provider whose work has mostly focused on data visualization efforts, I see an obvious question embedded in the above visual: is it possible to quantify the impact of the presence of data governance on the number of hours it takes to complete a data visualization project? Specifically, if there is a lack of available, clean and trusted data—that is, if we have a lot of “dirty data”—how does that impact the effort required for the project? Further, how does this impact the ROI of the project?

To try to answer this question, my colleague, Jon Chua, and I kicked off our quest with some basic data collection. We first took an inventory of all the data visualization projects Kenway has done over the last six years. We then looked at the number of hours it took to complete each project and proceeded to dig into the details by considering the size and complexity of the scope of each project.

By reaching out to the resources that were involved in the projects, we got an understanding of the current state of the clients’ data governance maturity level at the beginning of each project. Furthermore, we also tried to get a sense of the scope of the project by asking how many dashboards were required and the relative complexity of them.

Having this data in place allowed us to perform some basic exploration. We started by simply plotting the hours taken to complete a project against the number of dashboard views required by the client.

 

Typically, we would expect this to be a smooth and steadily rising line where more dashboard views were related to more hours. However, as you can see in the above graph, that is not the case. We notice a multi-modal distribution, which suggests there are other factors, such as data governance, that contribute to accurately estimating the number of hours.

To weigh the impact of data governance, we ran a regression analysis relating the hours required to the number of dashboards and the presence of data governance. Utilizing R to perform this analysis, we arrived at a model of the form:

    Hours ≈ 98 + 13 × (# of dashboards) + 25 × (# of dashboards, when data governance is absent)

Interpreting the above model, we see that at least 98 hours are required to complete a project even when the required number of visualizations is zero, irrespective of the presence or absence of data governance. We can think of this number as the baseline effort required to complete the up-front tasks necessary to create a visualization, such as onboarding, gathering requirements, defining scope, etc.

Next, consider the scenario where the client has formalized data governance. They have the roles and responsibilities defined for data ownership and stewardship, they have a clear understanding of their data sources, and they have mechanisms (i.e. policies, procedures and enforcement) in place to ensure that data quality standards are upheld during data entry. In this case, we observe that the total hours required for each additional dashboard is, on average, 13 hours.

On the other side, the absence of data governance can be incredibly costly. In such a scenario, the number of hours required to create a dashboard nearly triples to 38. In other words, we observe that, on average, each dashboard required an additional 25 hours (38 − 13 = 25) when data governance was not in place.
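To put the model in concrete terms, consider a hypothetical ten-dashboard engagement:

    With data governance:    98 + (13 × 10) = 228 hours
    Without data governance: 98 + (38 × 10) = 478 hours

The same scope more than doubles in effort when the project starts from dirty data.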

Poor data governance, or the absence of it, is one of the leading causes of “dirty data.” Per our analysis, we believe that the effort to collect, catalog and cleanse the data in the absence of data governance has a significant, measurable impact on the overall effort of creating data visualizations. Based on these findings, the up-front cost of setting up a data governance framework at your organization has clear benefits when you factor in the future cost savings!

Want to learn more about data governance? Check out Kenway’s approach HERE or drop us a line at [email protected]!

 

Decision Trees, Predictive Models, and Risk Management: When You Miss a $1B+ Impact

Unless you have been living off the grid for the past few weeks, you have likely heard of the unfortunate situation on a United flight from Chicago to Louisville where a paying customer was forcibly “re-accommodated” to another flight to make room for United staff who needed to be transported to Louisville. While some of the details around the events that transpired may be in dispute, the outcome was indisputably gruesome. At one point the following day, United’s market cap had dropped by over one billion dollars. Beyond the cost of the pending lawsuit by the customer who was forcibly removed from the plane, United is facing the cost of the public relations nightmare and the cost of lost customers who may never fly with them again. I’m not here to debate United’s policies or procedures. Rather, I am interested in understanding the $1B+ breakdown that occurred at the intersection of United’s risk management, decision trees, and predictive analytical models.

First: Why do airlines overbook flights?

It is a common industry practice for airlines to overbook some of their flights because events often arise that would result in a fare-paying customer failing to board. This includes things such as late arrivals, missed connections, customer cancellations, and rescheduling, among others. In some of these cases, the airline may still recover some revenue in the form of cancellation or rescheduling fees. However, in the case of missed connections, they miss out on revenue if the seat goes unfilled. Further, as the laws of supply and demand would dictate, airlines often make their greatest margin on the last few seats sold on each flight. Rather than increasing the cost of tickets across the board, and potentially being less competitive on pricing versus the competition, airlines have built predictive models to help determine on which flights they should oversell and the quantity to oversell. These predictive models are heavily based on historical data (i.e. there are certain airports where connections are more likely to be missed), seasonality, and weather, among other factors. These models are (or at least should be) continually fine-tuned.

Second: Predictive analytics aren’t designed to predict exactly what will happen; they are designed to demonstrate what is most probable to happen.

Despite the sophisticated predictive models that airlines have created, when flights are oversold, there are times when passengers are denied boarding. In many cases, it is voluntary (a passenger voluntarily surrenders their ticket in exchange for compensation), but in other less fortunate cases, it is involuntary. As the chart below demonstrates, most airlines do it; some just do it better than others.

 

Source: fivethirtyeight.com

Third: Where do risk management and decision trees come into play with predictive modeling? 

The predictive model needs to be informed by risk management and decision trees that leverage historical data to determine the acceptable number of seats to oversell to maximize profit on a given flight.  If the model is too aggressive, significant compensation is required for passengers denied boarding, and the flight is less profitable.  If the model isn’t aggressive enough, the opportunity for additional profitability will be missed.

In the case of United, I suspect they grossly underestimated the risk when assessing the decision tree used to create the expected value based on the number of seats to oversell. For purely illustrative purposes, let’s play out a few scenarios of the risk management and decision tree. Note, this doesn’t account for the difference in value between flight vouchers and cash, or other costs.

Probability of a passenger on a United flight:

Risk probability when a passenger is denied boarding (in a hypothetical one-seat overbooking):

In this model, United would need to adjust its pricing strategy to account for a probable cost of a denied boarding of $3,751.30 per passenger. If that were true, United would probably sell very few tickets. I suspect that United neither considered the probability nor the adjusted cost of the final risk listed above, and accounted for only $1.30 per passenger in probable costs related to a denied boarding.
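As a purely illustrative sketch (these component figures are hypothetical and back-solved to reconcile with the totals above; they are not United’s actual probabilities or costs), the expected-value arithmetic might look like:

    Routine denied boarding only:   0.26% probability × $500 voucher = $1.30 per passenger
    Adding a catastrophic PR event: $1.30 + (0.0003% probability × $1.25B impact) = $3,751.30 per passenger

The specific numbers matter less than the structure: a tiny probability multiplied by a billion-dollar consequence still dominates the expected value.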

Had United accounted properly for the risk in their model, they might have done a few things differently. First, they may have tweaked their predictive model to lower the probability that a customer is denied boarding (see Hawaiian Airlines). Second, they may have changed their process for dealing with overbookings to reduce the likelihood of a customer being involuntarily denied boarding. News sources indicated that United used random selection to determine the customers that were involuntarily removed from the plane. While random may equate to “fair”, random also equates to dumb. Given the information that is available or that could potentially be acquired, it would be borderline stupid for United to randomly select a customer. Even if they narrowed it down to randomly selecting passengers based upon the lowest flight status, it would still be dumb. Information can be utilized to make better decisions than random. Consider the case of Delta (further details here), which allows customers to define in advance what they would be willing to accept to be bumped from a flight. The decision then becomes far less random, and more about informed economics.

So, let me ask you: Have you properly accounted for all risks in your predictive models?

Kenway is here to help. If you’d like to learn more about how we help companies build comprehensive predictive models, or if you’d just like to find out more about who we are, please reach out at [email protected].

 

Data Wars: White Paper on In-Memory Reporting Tools

“Just give me the data.” When Kenway Consulting engages in Business Intelligence (BI) projects, many of them begin with that simple phrase— “Just give me the data.” Organizations want their data from various source systems in the hands of their power users. Doing so allows them to leverage the industry expertise and analytical mindsets for which they hired these resources. To maximize our value during a BI project, we believe in getting our clients the data that addresses their highest-impact business questions early in the data discovery phase and then iteratively developing it in an in-memory data visualization tool...

 

Recovering Lost Revenue Using Vendor Audit Analytics

 

Vendor and supplier risk management can be tricky—how do you “grade” vendors and suppliers? How do you ensure that contracts are being adhered to? How do you empower your organization’s resources when it comes time for negotiating?

Effective data utilization is a crucial component of vendor and supplier risk management, because it provides valuable data points into contract adherence and other vendor evaluation criteria. In our interactive presentation, you will learn how to Recover Lost Revenue Using Vendor Audit Analytics.

We will walk through an example where data analytics was used to identify newly recognizable revenue for a client by leveraging vendor sales data, purchasing data, and contract information. We automated the contract audit and helped ensure that negotiated cost structures were being followed; when they weren’t, we recovered otherwise lost revenue. We also fixed future transactions and quickly discovered negotiation opportunities.

Learning Objectives:

 

And the Survey Says….

Kenway just wrapped up a project where we had the opportunity to work towards improving a client’s Net Promoter Score (NPS). On the surface, the concept is simple: customers are asked to provide a ranking based on their experience with a company. Specifically, on a scale of 1-10, how likely are customers to recommend <Insert Company> to their family and friends? Customers who give a 9 or 10 are a “Promoter”, while customers who give a 7 or 8 are “Passive” and those who give a 6 or less are a “Detractor”. To calculate NPS, simply take the % of Promoters less the % of Detractors. Anything above 0 means you have more Promoters than Detractors, and you are well on your way to happy customers. Anything above 50%, and you can consider yourself best in class!
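To illustrate with made-up numbers: if 500 customers respond, 300 of them score a 9 or 10 (60% Promoters) and 100 score a 6 or less (20% Detractors), then:

    NPS = % Promoters − % Detractors = 60% − 20% = 40

A score of 40 means far more Promoters than Detractors: solidly positive, though still shy of best in class.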

Critics of NPS say it’s too simple, it’s not statistically relevant, and there is a lack of sufficient evidence to indicate that tracking NPS drives growth. The critics have a point; however, it’s hard to argue against any process with such a high focus on customer satisfaction. We are constantly looking to our family and friends for recommendations. If I were to look at any of my social media accounts right now, I could almost guarantee there is a post asking for recommendations about a company or product, providing reasons why I should never use a company or product, or promoting a company or product I absolutely have to try.

As Kenway pointed out in an earlier blog, ‘Your Information is Worthless’, information is not valuable unless the data is captured and interpreted in the right way.  Calculating your NPS is a valuable way to capture customer feedback as long as companies go beyond simply calculating the metric and actually develop a process to analyze, understand and drive necessary change.  Companies who keep the following best practices in mind will likely see value in capturing their NPS score:

  1. Plan & Identify

Identify which customers you want to include in the survey. Some companies may decide to target their entire customer base. However, for starters, a more focused survey may be better. The goal of any NPS process is to drive business value through customer service excellence. If you have a specific territory that is performing poorly or a call center that is having significant issues, then you may want to target customers in those areas. Think about the key opportunities within your business and start by focusing on those areas.

  2. Design & Develop

Design and develop your survey. Your survey will include the standard NPS question, but how should it be worded? Again, this will depend on how targeted you are trying to be. Some companies decide to keep it very generic (e.g. “On a scale of 1-10, 10 being the highest score, how likely are you to recommend <insert company> to a friend or a family member? If you had to name one thing that we could improve upon, what would that be?”). Others may choose to target how satisfied customers were with their call center reps and ask a more targeted question (e.g. “During your last interaction with us, you spoke with one of our Call Center Representatives. On a scale of 1-10, if you had your own company that was focused on service, how likely would you be to hire this person to work for you? What exactly stood out as being good or bad about this service?”). Regardless of the approach you decide to take, the key is to follow up with a question that helps you understand what drives your Promoters and Detractors. Understanding these drivers is just as important as the score itself.

  3. Data Integration

In addition to analyzing the results of the survey question, it is very important to integrate your survey data with other key metrics to better understand factors that may or may not be impacting the customer’s satisfaction.  Integrating the survey data with other key metrics such as account information, product mix, eligibility and support metrics will help identify the key drivers behind customer satisfaction.  For example, looking at the product mix of your customers may help determine whether customers with complex product mixes are more likely to be Detractors than Promoters.  Or, looking at support metrics (e.g. first call resolution, time to resolve a ticket, etc.) may help you identify key processes that need improvement.

  4. Measure & React

Now that you have the customer survey data at your fingertips, you have an invaluable opportunity to measure and react. While companies may not have the ability to reach out to all Detractors, you will find that customers react very favorably when a company reaches out to let them know they were heard and has a plan to make sure the same mistakes are not repeated. Additionally, you will likely start to uncover some quick wins (e.g. a quick change to a call center script or a simple tweak to your website) that will help you see an increase in your NPS score.

  5. Get Proactive

In addition to measuring and reacting, understanding what drives your customers’ satisfaction or dissatisfaction provides a unique opportunity to get proactive in driving increased satisfaction. Proactive communications are a definite convenience for customers. For example, airlines proactively contact customers to let them know about flight delays and severe weather warnings. Billing departments proactively alert customers when an invoice is coming due to keep them from becoming past due. Some wireless companies proactively identify customers who are experiencing dropped calls or poor service within their home and send them boosters to improve the service and retain their business. Using the data from the NPS survey can give you insight into the proactive touch points that can not only improve first call resolution, but eliminate contacts altogether.

Implementing an NPS process may not be the right approach for all companies, but customer satisfaction is a key component of sustaining a healthy company. If you are looking for help to measure and improve customer satisfaction, consider letting Kenway be your guide. We are here to help.