In my previous post (The First Level of Analytics for the Data Driven Company), we defined the basic building blocks of descriptive analytics and introduced the first of three applied descriptive analytic examples to help drive value for your company and make your decision-making better informed and more data-driven.

The goal of increasing efficiency in a systematic manner, fundamentally, is a commitment to the philosophy and process of continuous improvement. Measure, analyze, respond, act, measure, analyze, respond. Repeat.

The first applied descriptive analytic we reviewed last post was The Reports Audit and Data Model Creation or Optimization where we focused on how “report creep” can saddle your organization with a costly drag on employee productivity. By establishing a data model that classifies and organizes your data elements, you can realize huge gains in FTE hours saved in your organization’s reporting infrastructure.

Here are two additional applied descriptive analytic examples for your review:

Applied Descriptive Analytic #2: Key Performance Indicator (KPI) Review and Testing

Real-life Example: We recently helped a company that provided social services to various states, primarily placing at-risk kids into home placements (adoption, foster care, family, etc.). We were discussing how analytics could help increase their efficiency when one of the clients voiced a particular problem they faced.

Periodically, they had to provide the states they contracted with a summary of where the kids were in the system at a specific point in time (i.e. just entering, placed in care, receiving counseling or other services, exiting the system). We were astounded to hear that it took two to three weeks to assemble the information needed to populate this report. Since the company depended on these state contracts, this report qualified as a KPI and was both essential and strategic for their future growth.

Think about this example – the company produced hundreds of reports for internal and external audiences but had to manually gather data to answer what many of us in the outside world would expect to be a reasonable and even common information request. These folks were simply not tracking the right KPIs.

Your organization’s KPIs should be testable – they should be able to demonstrate that they are predictive of the outcomes you’re trying to achieve. One common problem we see is KPIs created from the bottom up – in other words, because a given metric is available on a report, it becomes the de facto selection. Useful KPIs should be created from the top down. The executive and leadership team should be responsible for determining the strategic definitions of success, and the metrics defining these conditions should be either selected or created. Again, having a defined data model gives you a leg up, as the analytics team can test to confirm the metrics are statistically verifiable in terms of the outcomes you’ve defined as positive.
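As a sketch of what that testing might look like (this is not the authors' actual method; the metric names, figures, and thresholds below are entirely hypothetical), one simple first check is to correlate each candidate KPI against an outcome leadership has defined as success:

```python
# Hypothetical sketch of KPI testing: correlate candidate KPIs against
# a defined outcome series. All names and numbers below are invented.
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between a candidate KPI and an outcome series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Outcome the leadership team has defined as success (quarterly revenue growth, %)
revenue_growth = [2.1, 3.4, 1.8, 4.0, 3.1, 2.6, 3.8, 4.4]

# Two candidate KPIs: one picked "bottom up" because it happened to be on
# a report, one chosen "top down" against the strategic definition of success
reports_printed  = [120, 98, 110, 105, 130, 95, 115, 100]
repeat_customers = [310, 420, 280, 510, 400, 350, 480, 560]

for name, kpi in [("reports_printed", reports_printed),
                  ("repeat_customers", repeat_customers)]:
    print(f"{name}: r = {pearson_r(kpi, revenue_growth):+.2f}")
```

A weak correlation doesn't disqualify a metric on its own, but it's a quick flag that a KPI may be measuring activity rather than the outcome you actually care about.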

One last comment – KPIs are never fixed. They, like everything we’re discussing, are subject to a process of continuous improvement. We’ve worked with companies that review their KPIs quarterly and ask what has changed in the make-up of the company since the last financial period. Do the critical KPIs still reflect the desirable strategic goals? If you’re not asking these questions, over time your KPIs will drift, and you’ll find yourself wasting time, money, and hours monitoring the wrong metrics.

Applied Descriptive Analytic #3: Attribution Analysis

Real-life Example: We worked with a client in Hong Kong that was spending millions of dollars on multiple on-line and off-line channels to sell a product. They wanted to know if they were allocating their funds to optimize their growth. We worked with their digital marketing team to examine the reporting streams associated with each of the digital channels and built an attribution model to measure contribution.

What we found in the data was interesting. The digital team was seeing a much higher number of arrivals to their website from several social media channels compared to more direct digital marketing channels. The contribution of social to conversion was also higher than the direct channel options, even though the latter cost more. But why?

By examining the specific content in the digital channels using various listening platforms, we discovered a fascinating trend: the social chatter revolved around people discussing a series of advertisements the institution had placed on billboards affixed to city buses. The selection of male and female models and the clothes and accessories they were wearing had caught people’s attention, which translated into shares, more pull-through visits to the website, and sales of the product.

In this example, the attribution model not only showed that the cheapest digital channel was contributing more conversions; further analysis of the results uncovered that the root cause of the increased conversions was an inexpensive series of traditional ads. And yes, thanks to the power of analytic attribution modeling, the featured male and female models got more work! The client also benefited, increasing their marketing efficiency by realigning their spend to take advantage of this information.
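For readers curious how "contribution" gets credited, here is a minimal sketch of two common attribution rules. This is not the model we built for the client; the channel names and conversion paths are synthetic, chosen only to show how the choice of rule changes which channel looks valuable:

```python
# Toy attribution sketch: compare last-touch vs. linear credit rules
# over synthetic conversion paths. All channel data is invented.
from collections import Counter

def last_touch(paths):
    """Credit each conversion entirely to the final channel touched."""
    credit = Counter()
    for path in paths:
        credit[path[-1]] += 1.0
    return credit

def linear(paths):
    """Split each conversion's credit evenly across every channel touched."""
    credit = Counter()
    for path in paths:
        share = 1.0 / len(path)
        for channel in path:
            credit[channel] += share
    return credit

# Each list is the ordered sequence of channels one converting customer touched.
paths = [
    ["social", "search", "direct"],
    ["social", "direct"],
    ["display", "direct"],
    ["social", "search"],
    ["social"],
]

print("last-touch:", dict(last_touch(paths)))
print("linear:    ", dict(linear(paths)))
```

Under last-touch, "direct" looks dominant because it tends to close; under the linear rule, "social" surfaces as the consistent early influence – exactly the kind of shift in perspective that revealed the bus-billboard effect in the story above.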

These are just a couple more examples of how applied descriptive analytics can help drive value for your company and make your decision-making better informed and more data-driven. In our next blog series, we’ll shift gears and look into the realm of Predictive Analytics and how data builds on data as companies increase their depth and maturity in applying analytics to their business.

Article by Chris Schultz, a principal at Analytic Marketing Innovations (www.analyticmarketinginnovations) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of action, helping your business leverage data insights for growth and transformation.

A recent Forbes article by Maciej Kranz points out that IoT adoption has been “more complex, costlier and riskier” than anticipated, making it a slower process than was originally predicted. Kranz looks at four key differences between expectation and reality. When IoT was introduced in 1999, and as it has taken shape over the last 10 years, people were promised an idealistic world of technology. Many expected that the IoT industry would be further developed at this point. Kranz details key predictions vs. today’s reality and explains how businesses can prepare for “IoT’s continued evolution.”

Prediction 1: The IoT will be an overnight sensation

The first is the rate at which the IoT industry would grow. Initially forecast at 50 billion devices by 2020, that number has since been lowered to 20 billion. This is partially due to many companies encountering barriers when trying to implement IoT. The primary barrier has been the cost and speed of implementation. Other roadblocks Kranz mentions are the recalibration of sensors, integration into legacy infrastructures, and the need for heavy customization.

Prediction 2: Vendors thought they could go it alone

Vendors expected to be able to build vertically and horizontally with sensors and software. But they have had to refocus on their core capabilities, and customers have now become the driving force as to how and why IoT is implemented. Consumers have pushed various specialists to work together to deliver a solution. “IoT requires collaboration.”

Solutions built on data collected and analyzed through IoT devices are dramatically improving operations of many companies while enabling others to create new value propositions, new services, new revenue streams and new business models. Although some of the predictions of the IoT didn’t quite pan out the way we had envisioned, businesses must take note of the realities and adjust expectations and approaches accordingly.  –Maciej Kranz, Forbes Councils

Prediction 3: IoT technology would be seamlessly interconnected

When it was first introduced, IoT was an idealistic solution with billions of devices interconnected.  People did not expect to struggle with connecting the digital and physical worlds. Issues making connections led to vendor groups working together to set standards. In the industrial market, OPC/UA is becoming the common ground.

Prediction 4: Traditional security solutions would be enough

The final topic Kranz covered was security. It was assumed that old OT security tools and generic IT tools could operate and secure new IoT technology. Since then, we have learned that an integrated architectural approach is the best security strategy: more specifically, one flexible security architecture for the entire enterprise that is multi-vendor and developed jointly by customers and horizontal/vertical specialists. Security is still one of the greatest obstacles to IoT adoption.

The learning process has been slower than expected with implementation, but this is just part of the growing pains of a new industry. That being said, IoT is helping and enabling growth across many industries. The misalignment of expectations is an example of our tendency to overestimate a technology’s effect in the short run and underestimate it in the long run. Kranz concludes that he “remains steadfast that the future (of IoT) will be incredibly bright.”




In my last post (Becoming Data Driven), we began a discussion about the goal of becoming a data driven organization. We determined it’s not so much about the tools as it is about leadership, philosophy, and decision processes of a company that help to reach a data-driven state.

If you are data driven then your analytic tools and insights are helping you drive another dollar of revenue, reduce another dollar of expense, find ways to do more with less, and secure your future against disruption.

As part of the post, we introduced the Business Threat Assessment (BTA), a mechanism used by business leaders interested in being more data driven. The outcomes of this assessment are a list of three tactical threats, three strategic threats, and the three most persistent challenges to an operation’s efficiency. It’s a way to get organized by establishing a meaningful priority set that should be evergreen. The BTA is as much a way of thinking as it is a process, and it should be scalable up and down your team. Your managers should be able to weigh in with their interpretation as it relates to their areas of responsibility.

In this post, we will begin to leverage the outputs of your Business Threat Assessment by introducing how to leverage the first level of analytics – descriptive analytics – to better align your data to your fundamental business goals.

Descriptive analytics encompasses how your current and historical data sets are produced, manipulated, and displayed (as compared to predictive, or forward-looking tools). Here are the foundational elements a company should include, in order of complexity, to develop a successful descriptive analytics solution:

  1. The Data Assets – All the data generated and captured in your operation, your company’s ocean of data, so to speak. In my first RUMBLE blog post (The Data Map: The Road to Managing Data as an Asset), we talked about creating a data map that inventoried these assets and this is a necessary first step for any company to pursue.
  2. The Data Management Infrastructure – The various repositories where you are storing data today – it might be organized, partly organized or not organized at all.
  3. The Data Presentation Layer – Your existing reports. 
  4. A Data Model – An overlay that ties all your data elements together, defines their types and values, and illuminates the relationship and sequencing between them. This can be as-built, a reflection of what has grown over time, or optimized (detailed later).
  5. A Data Archive – An advancement over “run of the mill” data management and storage, a Data Archive constitutes a designed repository and database infrastructure that typically integrates and organizes your data elements into an efficient structure that is more easily accessed and manipulated for reporting and analytics.
  6. A Business Intelligence Tool – These tool kits (Tableau, QlikView, Alteryx, and Microsoft Power BI) optimize the visual display of data and reporting by integrating user-configurable dashboards, reporting schedulers, and distribution and publication functionalities.
  7. An Analytic Data Set and Toolkit – A specialized data repository created by your analytics team that is populated by the critical data element subsets most relevant to your analytic requirements. This advanced approach uses exploratory data analysis to statistically validate which subset is most useful for query and investigation.
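As a toy illustration of item 4 above (not a prescribed schema; every name here is hypothetical), a data model overlay can start as simply as recording each element's type, its source repository (item 2), and the reports it feeds (item 3):

```python
# Toy data-model overlay with invented names: each element records its
# type, its source repository, and the downstream reports it feeds.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    dtype: str
    source: str                                 # repository holding the element
    feeds: list = field(default_factory=list)   # downstream reports

model = [
    DataElement("customer_id", "string", "crm_db", feeds=["sales_report"]),
    DataElement("order_total", "decimal", "orders_db",
                feeds=["sales_report", "finance_report"]),
]

# With the overlay in place you can answer structural questions,
# e.g. which elements feed the sales report:
sales_inputs = [e.name for e in model if "sales_report" in e.feeds]
print(sales_inputs)
```

Even a skeleton like this makes the relationships and sequencing between elements queryable, which is what the later items (archive, BI tool, analytic data set) build on.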

In short, descriptive analytics manipulates your current and historical data assets we’ve listed above (including your reports) to make more effective business decisions possible.

Descriptive Analytics Applied

The goal of increasing efficiency in a systematic manner is a commitment to the philosophy and process of continuous improvement. Measure, analyze, respond, act, measure, analyze, respond. Repeat.

Common applied descriptive analytics initiatives proven to increase efficiency include:

  • Reports Audit and Data Model Optimization
  • KPI Review and Testing
  • Attribution Analysis

In this blog, we’ll start with a more in-depth analysis of the first example.

  1. The Reports Audit and Data Model Creation or Optimization

Real-life Example: A client had decided to implement a new Business Intelligence (BI) tool and requested our help migrating the reporting infrastructure of their investment accounting team. We discovered a reporting infrastructure of over 200 spreadsheet reports that had accrued over the previous decade. Each was hand-built, hand-operated, and tied to critical processes of month-end and quarter-end close cycles.

There was no data model, and the data feeds driving these reports came from over 50 discrete sources. We conducted a reporting audit and found that many of the report elements overlapped. We were able to create a data model that established the key sources and data elements, including their relationships and locations, and that populated a data repository we built for the BI tool. The result was a 50% reduction in the number of reports, with hours saved equivalent to 1.5 FTE headcount.
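One way to surface the kind of overlap a reports audit looks for can be sketched as follows. This is a hedged illustration, not the engagement's actual tooling; the report names and data elements are hypothetical. The idea is simply to compare the set of data elements each report consumes:

```python
# Sketch of a reports-audit overlap check using Jaccard similarity.
# Report names and element sets below are invented for illustration.
from itertools import combinations

reports = {
    "monthly_close_summary": {"gl_balance", "accrual", "fx_rate", "book_value"},
    "quarter_end_rollup":    {"gl_balance", "accrual", "book_value", "yield"},
    "fx_exposure":           {"fx_rate", "notional", "counterparty"},
}

def jaccard(a, b):
    """Overlap of two element sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

# Pairs above the threshold become consolidation candidates in the audit.
THRESHOLD = 0.5
for (name_a, elems_a), (name_b, elems_b) in combinations(reports.items(), 2):
    score = jaccard(elems_a, elems_b)
    if score >= THRESHOLD:
        print(f"{name_a} <-> {name_b}: {score:.0%} overlap")
```

Run over a few hundred reports, a check like this quickly shortlists the pairs worth a human review, which is where the real consolidation decisions get made.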

Most companies can benefit from a process like this. Understanding the type, frequency, and audience of all the reports you produce allows you to establish control and impose efficiency where it may not currently be present. If you can’t point to a data model that classifies and organizes your data elements, you can’t control the evolution of report creep. A good place to start is to consciously review the time, dollars, and staff supporting your current reporting structure. This investment in TIME should give you the ROI urgency to ensure a data map is created and optimized.

In my next blog I’ll introduce two additional applied descriptive analytics examples: KPI Review and Testing and Attribution Analysis. Both will include real-life examples of how organizations benefited from their use.

(Article by Chris Schultz, a principal at Analytic Marketing Innovations (AMI) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of action, helping your business leverage data insights for growth and transformation.)

RUMBLE strategic partner, Cradlepoint, recently sponsored a white paper (A Sensible Approach to Smart City Projects) showing that the most successful smart city initiatives have been those that started small.

“Some companies will come and try to create a global dashboard that tries to take data from every part of a city and analyze it,” said Ken Hosac, vice president of IoT strategy and business development, Cradlepoint. “The approach we’ve been taking is to find a specific targeted use case and make that use case better, or enable a use case that didn’t exist before.”

This more “sensible approach” of refining one service at a time, instead of undertaking a massive project upfront, has helped overcome the hesitancy of city managers to take advantage of smart city technologies.

The White Paper explores various facets of smart city initiatives, including:

  • Examples of smart city technologies
  • The potential benefits of smart cities in enhancing the quality of life for their citizens
  • Common and pragmatic first steps in embarking upon smart city initiatives
  • Smart city pitfalls to avoid

Cradlepoint is a strategic partner of RUMBLE and the global leader in cloud-delivered wireless edge solutions for branch, mobile, and IoT networks. RUMBLE improves business performance by designing and deploying custom end-to-end IoT solutions with ROI built-in. To discuss your specific smart city challenge, click Ready to Rumble.

In my last blog post (The Data Map—The Road to Managing Data as an Asset), I provided advice on how an organization can go about building a data map in order to establish a baseline of the data assets underlying their enterprise. Now, before we look at organizing those assets for analytic purposes, let’s talk about a different challenge that is foundational to everything else—becoming data driven. You probably hear this phrase a lot but may not necessarily see a lot of useful discussion about what it means, in practical terms. 

Let me start with my own personal core belief about this. Becoming data driven has a lot LESS to do with software, tools, BI, analytic techniques and wonder weapons, and a lot MORE to do with your organization’s culture, leadership, and philosophy about decision making and decision support.

Things that do not make your company data driven:

  • Buying an expensive business intelligence tool, along with the usurious licenses it requires, and installing it on a multitude of desktops.
  • Hiring data scientists and creating CDO titles.
  • Creating a data mart, warehouse, or lake and accumulating metric tons of data.

Most of these things may have a place in a company’s evolution toward being data driven, and they may be necessary (at some point), but they are definitively not sufficient (at any point) to drive the transformation. Put another way, in my years as an analytics consultant, I have worked with many smaller companies whose toolkits were basically just Excel based and were far, far more data driven than giant enterprises that had spent millions on the items illustrated above and were failing to address the core changes necessary to actually utilize them.

Being data driven means having what fighter pilots call maximum situational awareness— striving for near perfect clarity on the state of your operation, and relentlessly seeking highly informed insight into what is likely to come in the near, intermediate, and longer-term future.

In the absence of leadership making the difficult changes to their operational processes, companies don’t fully utilize the capabilities these toolkits deliver in a data driven manner. You fundamentally have to a) trust the data and b) be willing to have the courage of your convictions to drive the outcomes the analytics illuminate. Those convictions are often torpedoed by leadership-centric issues of politics, expediency, procrastination, or cults of personality. I have worked for companies where that list constitutes the entire operational methodology. We laugh, but everybody reading this knows it’s true, and probably sees some of it at their own company every day. If you’re a leader, and your reaction is “not at my company,” well, good luck!

Let’s not kid ourselves. These are very common problems, to a greater or lesser degree, at many companies. These kinds of organizational behaviors and dysfunctions are the biggest barrier to becoming data driven, not the lack of shiny tools and cool software.

So, this begs the question—what is the CEO, CMO or COO who is truly committed to making this happen supposed to do?

At its very core, becoming data driven means being fact-driven. It means making more efficient, informed decisions. It means having what fighter pilots call maximum situational awareness—striving for near perfect clarity on the state of your operation, and relentlessly seeking highly informed insight into what is likely to come in the near, intermediate, and longer-term future. It means embracing measurement and celebrating the results—both good and bad. Those qualifiers, by the way, are probably holding you back right now. What you want, as a manager, is accuracy—and if you want to get your people in the habit of thinking that way, you should be substituting “accurate” and “inaccurate” as your key descriptors for your numbers. Whatever you do, don’t punish people for bringing you numbers or analysis that you don’t like—if it is accurate. Reward honesty in measurement, regardless of the relative interpretation.

If you are committed to reaching this goal of becoming data driven, then you are likely going to take your company on a journey through the three levels of analytics:

Descriptive Analytics: Focused on maximizing the utility of the datasets generated by your current operation, supplemented with other data sources, to increase efficiency.

Predictive Analytics: Deploying tools that will allow your operation to anticipate customer needs and to model forecasts of possible business scenarios (product launches, for example).

Prescriptive Analytics: Currently much debated in definition, but grounded in the implementation of advanced AI and machine learning techniques to address complex, multi-variate questions. Characterized by a state of maximum automation, it can be thought of as the point where smart machines begin to manage much of the operational decision making in an enterprise.

So, how to begin that journey?

Issue an RFP for an advanced BI tool, right? WRONG! You have homework to do, my friend. Developing the roadmap that will eventually guide you through this journey means coming back to the core of what being data driven means—understanding the current state of your operation, and what the priorities are for you to achieve:

  1. Drive revenue and growth
  2. Reduce expense and grow margin
  3. Increase the efficiency of the operation (in many cases, cost avoidance rather than cost reduction)

To make these things happen, in a data-driven way, look at what barriers are blocking progress across these strategic goals. I advise a company to start with a very straightforward exercise—the Business Threat Assessment.

Becoming Data Driven: Step 1—The Business Threat Assessment

This is the foundational step to all that follows. It establishes the priorities that analytic solutions need to address, and it’s entirely in the control of the company to achieve. The company needs to answer three fundamental questions:

  1. What are the three greatest tactical (next 1 to 2-year horizon) threats to the operation’s success?
  2. What are the three greatest strategic (next 3 to 5-year horizon) threats to the operation?
  3. What are the three greatest, persistent operational issues the company seems to face, year after year?

Some words of advice about this: If, upon reading this list, your first impulse is to reach for the phone and call a high-powered (expensive) business consultant to come in and execute this, you’re already off the rails. This is an exercise that any company should be able to accomplish without any external help—and if you can’t, you have bigger problems than analytics can fix. Get the bright leaders in your company to spend a day on this. And, if your feeling is “I can’t trust this to be done right,” then do pick up the phone, call a recruiter, and get on top of your real issue.

Let’s be honest. Anybody in a leadership position should have a pretty good idea of the answer to these questions. If you’re not talking about them today, in a regular fashion, then your first step on the way to becoming data driven is to institutionalize this list, refresh it on a monthly basis, and focus the leadership team on addressing it.

Why is this first step necessary?

Because being data driven means committing to a process of continuous improvement. As leaders and managers, you are prioritizing your people, their assets, and their efforts. If you’re not clear on the size, pressure and importance of the challenges facing the enterprise, you’re not able to task anybody effectively. You should be developing plans that deliver the maximum return to the business along the three key metrics we’ve discussed (growth/cost/efficiency), and those plans are NOT one and done. They are a continuous, reinforced, optimized set of decisions that are consciously selected to deliver maximum, measurable return.


In our next post, we’ll talk about how to take the outputs of the BTA, and do the exercise of asking the question, “How can my current data assets and KPIs help me address these challenges?” We’ll be in the land of descriptive analytics and talk about taking a hard look at your current reporting infrastructure before you spend a dollar to change it.


(Article by Chris Schultz, a principal at Analytic Marketing Innovations (AMI) and a RUMBLE Strategic Partner. Their solutions delivery approach identifies executable steps and recommends both near-term and long-term courses of action, helping your business leverage data insights for growth and transformation.)