A Leader’s Guide to Data Analytics
May 1, 2015


A working knowledge of data science can help you lead with confidence.


Yevgenia Nayberg

Based on insights from Florian Zettelmeyer

In recent years, data science has become an essential business tool. With access to incredible amounts of data — thanks to advanced computing and the “Internet of Things” — companies are now able to measure every aspect of their operations in granular detail. But many business leaders, overwhelmed by this constant blizzard of metrics, are hesitant to get involved in what they see as a technical process.

Want to learn more? Consider Florian Zettelmeyer’s upcoming Leading With Big Data and Analytics program through Kellogg Executive Education.

For Florian Zettelmeyer, a professor of marketing and faculty director of the program on data analytics at the Kellogg School, managers should not view analytics as something that falls beyond their purview. “The most important skills in analytics are not technical skills,” he says. “They’re thinking skills.” Managing well with analytics does not require being a math genius or a master of computer science; instead, it requires what Zettelmeyer calls a “working knowledge” of data science. This means being able to separate good data from bad, and knowing precisely where analytics can add value.

A working knowledge of data science can help leaders turn analytics into genuine insight. It can also save them from making decisions based on faulty assumptions. “When analytics goes bad,” Zettelmeyer says, “the number one reason is that data that did not result from an experiment are presented as if they had.” This matters even for predictive and prescriptive analytics: “If you don’t understand experiments, you don’t understand analytics.”

Start with the Problem

Too often, Zettelmeyer says, managers collect data without knowing how they will use it. “You have to think about the generation of data as a strategic imperative,” he says. In other words, analytics is not a separate business practice; it has to be integrated into the business plan itself. Whatever a company chooses to measure, the results will only be useful if the data collection is done with purpose.



Like all scientific inquiries, analytics needs to start with a question or problem in mind. Whether it is a software company that wants to improve its advertising campaign, or a fast-food company that wants to streamline its global operations, the data collection has to match the specific business problem at hand. “You can’t just hope that the data that gets incidentally created in the course of business is the kind of data that’s going to lead to breakthroughs,” Zettelmeyer says. While it is obvious that some kinds of data should be collected — for example, consumers’ browsing behavior — customer interactions have to be designed with analytics in mind “to ensure that you have the measures you need.”

Nor can managers rely on data scientists to take the lead. Ultimately, it is the manager’s job to choose which problems need to be solved and how the company should incorporate analytics into its operations. Executives, after all, are the ones who have to make decisions; therefore, they should play a central role in determining what to measure and what the numbers mean to the company’s overall strategy.

Understand the Data-Generation Process

“There is a view out there that because analytics is based on data science, it somehow represents disembodied truth,” Zettelmeyer says. “Regrettably, that is just wrong.”

So how can leaders learn to distinguish between good and bad analytics? “It all starts with understanding the data-generation process,” Zettelmeyer says. “You cannot judge the quality of the analytics if you don’t have a very clear idea of where the data came from.”

Zettelmeyer says most managers share a common behavioral bias: when results are presented as having been achieved through complicated data analytics, they tend to defer to the experts. “There is a real danger in managers assuming that the analysis was done in a reasonable way. I think this makes it incredibly important for managers to have a sixth sense for what they can actually learn from data.” To make informed decisions, he says, it helps to take a step back and establish some fundamentals.

Because analytics often boils down to making comparisons between groups, it is important to know how those groups are selected. For example, a marketing department may want to judge the effectiveness of an ad by comparing consumers who were exposed to the ad with those who were not. If the consumers were selected randomly, the groups are what data scientists call “probabilistically equivalent,” which is the basis for good analytics. But if, say, they were exposed to the ad because they had shown prior interest in the product, this will lead to bad analytics, since not even the most sophisticated analytical techniques could answer the basic question: Was the ad truly effective, or was the consumer already interested?
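The selection problem above can be made concrete with a short simulation. The numbers here are hypothetical — a small true ad effect and a much larger effect of prior interest — chosen only to show how self-selected exposure inflates the measured lift:

```python
import random

random.seed(0)

def purchase_prob(interested, saw_ad):
    # Hypothetical model: prior interest matters far more than the ad.
    return 0.30 * interested + 0.05 * saw_ad + 0.10

def simulate(assign_by_interest, n=100_000):
    exposed, control = [], []
    for _ in range(n):
        interested = random.random() < 0.5
        if assign_by_interest:
            saw_ad = interested                  # ad shown to interested people
        else:
            saw_ad = random.random() < 0.5       # randomized assignment
        bought = random.random() < purchase_prob(interested, saw_ad)
        (exposed if saw_ad else control).append(bought)
    return sum(exposed) / len(exposed) - sum(control) / len(control)

print(f"randomized lift:    {simulate(False):.3f}")  # close to 0.05, the true ad effect
print(f"self-selected lift: {simulate(True):.3f}")   # close to 0.35: effect plus interest bias
```

With random assignment the two groups are probabilistically equivalent, so the difference in purchase rates recovers the ad's true effect; with exposure driven by interest, the same comparison mixes the ad's effect with the preexisting difference between the groups.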

This is not just a marketing problem. Take, for example, a hospital that wants to replace its ultrasound machines. Thanks to advanced wireless sensors, the hospital is able to measure in the course of business exactly how long it takes to perform an exam using the new devices, a metric that would help it decide whether to switch over for good. But the data show a surprising result: the new device is taking longer to use than the older one. What the hospital had not accounted for was a preexisting difference between two groups of technicians: novice technicians and experienced technicians. It turns out that more novice technicians, who were naturally slower than the experienced ones, were choosing to use the newer device, and this skewed the data. “The problem,” Zettelmeyer says, “is one of confounding technician experience with the speed of the device.” Again, analytics failed because it overlooked fundamental questions: What makes technicians choose one machine over the other? Is everything about the usage of the two machines comparable? And if not, was the correct analytics used to correct for that?
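A sketch of the confound, with invented numbers: suppose the new device is genuinely two minutes faster, but novices (who are slower overall) disproportionately choose it. A naive comparison then points the wrong way, while comparing within one experience group recovers the truth:

```python
import random

random.seed(1)

def exam_minutes(device, experienced):
    base = 20 if experienced else 30           # novices are slower overall
    speedup = 2 if device == "new" else 0      # the new device is truly 2 min faster
    return base - speedup + random.gauss(0, 1)

exams = []
for _ in range(50_000):
    experienced = random.random() < 0.5
    # Novices disproportionately pick the new machine (the confound).
    device = "new" if random.random() < (0.3 if experienced else 0.8) else "old"
    exams.append((device, experienced, exam_minutes(device, experienced)))

def mean(xs):
    return sum(xs) / len(xs)

naive = mean([t for d, e, t in exams if d == "new"]) - \
        mean([t for d, e, t in exams if d == "old"])
within_novice = mean([t for d, e, t in exams if d == "new" and not e]) - \
                mean([t for d, e, t in exams if d == "old" and not e])

print(f"naive comparison: {naive:+.1f} min")          # positive: new device looks slower
print(f"within novices:   {within_novice:+.1f} min")  # negative: new device is actually faster
```

Holding technician experience fixed is the simplest way to break the confound; a randomized assignment of machines to technicians would accomplish the same thing by design.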

Understanding the data-generation process can also uncover the problem of reverse causality. Here, Zettelmeyer points to the case of a company deciding whether or not to limit promotional emails. The data reveal that promotional emails are extremely effective: the more emails a customer receives, the more purchases they are likely to make. But what is not apparent in the data is that the company is following a piece of marketing wisdom Reader’s Digest hit upon decades ago, which found that loyal customers — people who bought more recently, more frequently, and who spent more on purchases — are more likely to buy again when they are targeted. So rather than the number of emails driving the amount of sales, the causality actually works the other way: the more purchases customers make, the more emails they receive. This means the data are effectively useless for determining whether email drives revenue.
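A toy simulation (hypothetical numbers, not from the article) makes the trap concrete: even when emails have no causal effect on purchases at all, targeting loyal customers produces a strong positive correlation between email volume and sales:

```python
import random

random.seed(2)

customers = []
for _ in range(50_000):
    loyalty = random.random()              # unobserved propensity to buy
    emails = int(loyalty * 10)             # marketers target loyal customers
    # Purchases depend only on loyalty; emails have NO effect in this model.
    purchases = sum(random.random() < loyalty for _ in range(12))
    customers.append((emails, purchases))

heavy = [p for e, p in customers if e >= 5]
light = [p for e, p in customers if e < 5]

print(f"avg purchases, many emails: {sum(heavy)/len(heavy):.1f}")
print(f"avg purchases, few emails:  {sum(light)/len(light):.1f}")
```

The heavily emailed group buys far more, yet by construction the emails caused none of it: loyalty drives both the targeting and the purchases. Only an experiment that randomizes email volume could separate the two.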

Use Domain Knowledge

In addition to making sure that data are generated with analytics in mind, managers should use their knowledge of the business to account for strange results. Zettelmeyer recommends asking the question: “Knowing what you know about your business, is there a plausible explanation for that result?” Analytics, after all, is not simply a matter of crunching numbers in a vacuum. Data scientists do not have all the domain expertise managers have, and analytics is no substitute for understanding the business.

Consider an auto dealership that runs a promotion in February. Based on a rise in sales for that month, the dealer assumes the promotion worked. “But,” Zettelmeyer says, “let’s say what they were trying to sell is a Subaru station wagon with four-wheel drive, and they completely ignored the fact that there was a giant blizzard in February, which caused more people to buy station wagons with four-wheel drive.” In cases like these, he says, having the data is not enough.

Know It — Do Not Just Think It

As Zettelmeyer sees it, decision making in the business world is being revolutionized in the same way that healthcare is with the widespread adoption of “evidence-based medicine.” As big data and analytics bring about this revolution, managers with a working knowledge of data science will have an edge. Beyond being the gatekeepers of their own analytics, leaders should ensure that this knowledge is shared across their organization — a disciplined, data-literate company is one that is likely to learn fast and add more value across the board. “If we want big data and analytics to succeed, everyone needs to feel that they have a right to question established wisdom,” Zettelmeyer says. “There has to be a culture where you can’t get away with ‘thinking’ as opposed to ‘knowing.’”

Developing such a culture is a big challenge for leaders. Organizations are rarely willing to admit the need for change, and few managers feel confident enough to lead with analytics. This, he says, will have to change.

“Can you imagine a CFO going to the CEO and saying, ‘I don’t really know how to read a balance sheet, but I have someone on my team who is really good at it’? We would laugh that person out of the room,” Zettelmeyer says. “And yet I know a whole bunch of people in other disciplines, for example, marketing, who, without blinking an eye, would go to the CEO and say, ‘This analytics stuff is complicated. I don’t have a full grasp on it. But I have assembled a crackerjack analytics team that is going to push us to the next level.’ I think this is an answer that is no longer acceptable.”

Florian Zettelmeyer is the Academic Director for Kellogg Executive Education’s Leading With Big Data and Analytics program.

About the Writer

Drew Calvert is a freelance writer based in Chicago.
