A recurring theme, and the impetus for the creation of our analytics solution, Toolbox, has been the failure of typical analytics projects to deliver material value to an organization. Case in point: we were recently presenting at a supply chain conference. We asked a group of roughly 150 VPs of supply chain to raise their hands if their supply chain had an analytics program. Everyone raised their hands. We then asked them to leave their hands in the air if their organization was satisfied with the value that program delivered. Only one hand remained. Gartner contends, in several studies, that roughly 60% of all analytics programs are abandoned or fail to deliver the projected ROI. This failure rate is what we have set our sights on resolving with the Toolbox platform and our unique approach to analytics delivery.
One of the challenges that drives this failure rate in typical analytics deployments is the speed at which organizations can both deploy analytics and iterate on them. In conversations with clients and prospects, we often find an existing enterprise analytics project that either has not formally made its way down to the supply chain and distribution network, or is mired behind IT resource constraints. This is a huge problem, as the most successful analytics deployments we have seen involve rapid iteration and the ability to pivot quickly.
Before we get into the nuts and bolts of how we attack and view analytics iteration, let’s define and qualify the difference between a simple visualization and something more akin to true analytics or an insight.
First, we view a simple visualization as nothing more than a report. Typically, these are tied to KPIs and show a snapshot of performance against a specific KPI. These visualizations provide no correlation between the metric and other actions or decisions within the facility, an action plan to resolve or improve the metric, nor causation around the reason a specific metric is at its current level.
Conversely, we view analytics and insights as visualizations that provide real value to an organization by drawing the correlation and causation between disparate data elements and actions. For example, a KPI may identify that cost to serve at a specific distribution center has increased, despite the fact that the workforce continues to maintain its labor rates. An insight, however, takes that singular cost-to-serve data point and determines that it is caused by an increase in operator travel time, which is in turn caused by ineffective slotting as the facility approaches its inventory capacity. This type of holistic information allows supply chain executives to have pointed conversations with the business about the financial impact of network overcapacity and utilization, which in turn moves the business forward. This is a simple example, but the difference is clear.
At MacGregor, each of our deployments looks to uncover and drive value to the client organization through these types of insights. There is a time and place for simple visualizations tied to KPIs, service levels, and the need to simply know where the business is at any point in time. However, these are nothing more than glorified reports and aren’t able to change behavior and move the business forward. True success in any analytics deployment is the marriage of both simple visualizations and game-changing insights.
Within our analytics deployment process, which hyper-values iteration and constant evolution of these insights, we view insights uniquely. When describing the necessity of insight iteration, I personally use the analogy of a world-class cyclist looking to improve their performance. The cyclist takes a test to assess their maximum oxygen uptake, strength, and stamina, as well as their pedal stroke. The “insight” at the end of this assessment is that their pedal stroke isn’t uniform across both legs, leading to inefficiency. The cause, accurately identified as a weakness in the left leg relative to the right, is a loss of power on the left-leg downstroke. A plan is then set up to increase left-leg strength and harmonize the pedal stroke. The plan is executed, leg strength is equalized, and additional performance is extracted. The team declares success, but they aren’t done. The pedal stroke insight is now irrelevant and obsolete; the cyclist and their team must find the next area to improve to continue their march toward a Tour de France victory.
This principle applies equally to supply chain analytics deployments. At MacGregor, we know that if we successfully move the needle and truly drive value to our clients in the form of impactful insights, the client will solve the problems we uncover and look to take the next step down the proverbial path toward perfection. Without continued iteration and evolution, they remain stagnant with nothing more than a report to tell them how they performed yesterday.
Our deployment process ensures each Toolbox customer recognizes the limitations of simple visualizations and understands that the path to true success is paved with constant iteration and improvement.
- Interested in getting more than KPIs, metrics, and service levels from your supply chain analytics deployment?
- Ready to raise your hand at a tradeshow or speaking engagement because your analytics deployment has been wildly successful?
Take the next step and reach out to our analytics team to learn more about the process, people, and platform that are redefining what a successful supply chain analytics deployment looks like.
Toolbox – The new standard in supply chain analytics deployments.