Leadership Analytics – Big Data Project Performance Goes Scientific

In the brave new world of business analytics fueled by big data, there has been significant discussion about the evolving roles of C-suite executives, including the CEO, CTO, and CIO. That discussion is now expanding to include the CMO (Chief Marketing Officer) plus the new roles of CDO (Chief Data Officer) and CDS (Chief Data Scientist). I do not have an MBA, and I usually don't undertake risky behavior, such as telling a CEO how to run her or his business. However, it is entirely appropriate for the CMO, CDO, and CDS to step up to the challenges of leading and directing their organization's analytics, big data, and data science efforts, respectively. It is also appropriate (and should be in the job description) for these execs to stand firm against corporate cultures and naysayers that resist big data analytics projects with remarks like these: a) "Let's wait and see how it develops elsewhere"; b) "We have always done big data"; or c) "What's the ROI? Show me the numbers."

I want to examine here the concept of leadership analytics. This differs from analytics leadership (which focuses on the characteristics of the leader) by focusing on the characteristics of the leader’s performance (specifically, project performance under the leader’s direction). Similar to developments in the emerging fields of HR analytics and talent analytics that use big data and data science methods (with quantifiable metrics) to identify, hire, engage, and reward good employees, it is also reasonable to apply the same quantifiable methodology to project leadership performance: that’s leadership analytics.

Words of wisdom

Principles of leadership are a dime a dozen. There are countless inspirational books, words of wisdom, and pithy witticisms that tell you how to be a good leader, what the characteristics of good leaders are, how to lead, when to lead, and so on. One of my favorite quotes is this: "Good judgment comes from experience, and experience comes from bad judgment." Another favorite is attributed to the "great one," hockey legend Wayne Gretzky: "Skate to where the puck is going to be, not to where it has been." Then there's this one (attributed variously to Gretzky and Michael Jordan): "You miss 100% of the shots you don't take." I will look at each of these three quotes from the perspective of leadership, particularly in the big data analytics environment. Hopefully, this will rebut the three naysayer remarks listed above, respectively, while also establishing a basic framework for leadership analytics.

“Good judgment comes from experience, and experience comes from bad judgment”

Data science is (or should be) primarily about the science! This means that the process follows rigorous scientific methodology: form a hypothesis, design a test (experiment), run the experiment, analyze the results, determine accuracy and error, and refine the hypothesis (if necessary). The scientific method is really a "scientific process cycle": it repeats until you are satisfied with the results (or run out of resources for more experiments). It is entirely acceptable to start with a simple hypothesis, which is likely to have poor accuracy and systematic errors (biases). You learn from the mistakes and errors that this simple model produces, and you then refine the hypothesis for the next round. An analysis of the errors should lead to better judgment in subsequent hypothesis generation and/or experimental design. Consequently, it is not surprising that successful projects are often measured by the depth and quality of the lessons learned, best practices, and continuous improvements that were produced and documented (i.e., how much "bad judgment" needed to be overcome in order to arrive at a good result?). If you have none of those, then who will trust your process when bigger business goals are at stake?
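The scientific process cycle described above can be sketched in a few lines of code. This is a toy illustration under stated assumptions, not a real workflow: the sample data, the `run_experiment` scoring function, and the crude `refine_hypothesis` update are all hypothetical names invented for this sketch.

```python
# A minimal sketch of the "scientific process cycle": hypothesize, run the
# experiment, measure the error, refine, and repeat until satisfied (or out
# of budget). The hypothesis here is just a slope m in the guess y = m * x.

def run_experiment(model, data):
    """Score a hypothesis on the data: mean absolute error of y = model * x."""
    return sum(abs(y - model * x) for x, y in data) / len(data)

def refine_hypothesis(model, data):
    """Nudge the hypothesis toward lower error (a crude, gradient-free update)."""
    step = 0.1
    candidates = [model - step, model, model + step]
    return min(candidates, key=lambda m: run_experiment(m, data))

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]   # noisy observations of y = 2x
model = 0.0                                        # deliberately poor first hypothesis
lessons_learned = []                               # document each cycle's error

for cycle in range(50):                            # the cycle repeats
    error = run_experiment(model, data)
    lessons_learned.append((cycle, model, error))
    model = refine_hypothesis(model, data)

print(f"final hypothesis: y = {model:.1f} * x, "
      f"error = {run_experiment(model, data):.2f}")
```

Note that the record of `lessons_learned` is itself a deliverable: the trail of bad early hypotheses and shrinking errors is exactly the documented "bad judgment" that builds trust in the process.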

“Skate to where the puck is going to be, not to where it has been”

Predictive analytics is about the future: where the customer will be, what the customer will buy, when a device will fail, and what the outcome of some intervention will be (whether in healthcare, education, or government policy). It is "old school" business intelligence to report where the business has been. Such reports are valuable and necessary, but in the new world of big data analytics, leaders want to know what is most likely to occur next, before it happens.

One of the most common use cases of predictive analytics is the recommender system. These systems are seen in nearly all online stores now: products are recommended to the customer based on a variety of similarity calculations. How similar are this customer's tastes and interests to those of other customers who also viewed/bought this item? How similar (or semantically connected) are other products in our inventory to the product that the customer is now viewing? How does this customer's interaction with us change as a function of context (e.g., time of day, the browser that they are using [mobile or desktop], their geolocation, their IP address)? A new O'Reilly book, "Practical Machine Learning: Innovations in Recommendation" (by Ted Dunning [Chief Application Architect at MapR] and Ellen Friedman), looks at this wonderful world of recommendation, predictive analytics, and "skating to where the puck is going to be." (I will discuss more aspects of this book in future blogs.) Your analytics project's performance can be measured by increases in the rate of successful conversions from a new recommendation engine, by improved accuracy in a predictive model, or by how far the target of your analysis has moved from where you were (descriptive analytics) to where you are going (predictive analytics).
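To make the similarity calculations above concrete, here is a minimal item-to-item similarity sketch. The tiny purchase matrix, the item names, and the `recommend` helper are all hypothetical, invented for illustration; production recommenders (like those discussed in Dunning and Friedman's book) work at vastly larger scale with far richer signals.

```python
# Item-to-item similarity: two items are "similar" when largely the same
# customers bought them. Each item is a 0/1 vector over customers.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two purchase vectors (1 = bought, 0 = not)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Columns are five customers; rows are items in a toy inventory.
purchases = {
    "book":   [1, 0, 1, 0, 1],
    "puck":   [0, 1, 1, 1, 0],
    "skates": [0, 1, 1, 1, 0],
    "jersey": [1, 0, 1, 0, 0],
}

def recommend(item, catalog):
    """Rank the other items by how similar their customer base is to item's."""
    scores = {other: cosine(catalog[item], vec)
              for other, vec in catalog.items() if other != item}
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("puck", purchases))  # "skates" ranks first: identical buyers
```

The same cosine machinery answers the customer-to-customer question too; just transpose the matrix so each vector describes one customer's purchases instead of one item's buyers.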

“You miss 100% of the shots you don’t take”

Brilliant ideas are useless without a goal and taking a shot at that goal. Your goal can take many forms: discover something new (in science); make a better decision (in government); take a more effective course of action (in business); provide a better experience (for your customers); deliver a better diagnosis (in medicine); etc. In any case, if you don't try (even if you fail), you haven't gained anything from your big data analytics investments. I prefer to define ROI as "return on innovation." In other words, have you taken your shots at new ideas, new products, new customers, new markets, new understandings, or new models? Have you scored on some of those shots? Can you point to innovations that have come from your efforts? Of course, we care about how much is spent on failed efforts. However, it is not the percentage of successes that matters, but the existence of successes. Measure them, report them, improve upon them, keep taking shots, improve your analytics "game," and deliver the ROI that really matters.

Grace Hopper said it…

In closing, I would like to add one more word of wisdom on leadership that can be applied to big data analytics. It comes from Admiral Grace Hopper: "The most dangerous phrase in the language is, 'We've always done it this way.'" It takes leadership to go a new way, especially into uncharted waters, and big data analytics is now providing ample opportunities for corporate leaders to do exactly that.


