Top Tips For Influencing People With Data


The evening before the Space Shuttle Challenger explosion, scientists at NASA identified what they thought was a potentially catastrophic risk: the O-rings might fail in the unusually cold temperatures expected for the morning's launch. They brought the issue to management's attention but failed to influence the final decision enough to stop the launch. Your failures to influence may not cost lives, but they could still be "catastrophic".

We make the vast majority of our decisions by pattern matching. This is hard-wired into humans because spotting patterns can mean survival. The most effective communication not only takes this fact into account, but leverages human psychology to your advantage.

Firm Up Your ‘Soft’ Skills

1. Tell a good story. We quants are more comfortable debunking the generalisability of anecdotal evidence than telling stories. However, humans are influenced far more by stories than by facts, so give your audience a narrative that they will remember.

Example: Rather than list the mean values for a set of numbers about shopper choices, say, “Mary is a typical shopper. She walks into a store and she …”

2. Become known for being right. In the world of getting people to listen to you, nothing succeeds like success. Look for opportunities to make relatively safe predictions — cases where it’s not a matter of “if” but merely a matter of “when”. Especially look for situations where your prediction is surprising or where it’s in a blind spot of your intended audience.

Example: While working on my PhD, I wrote a paper which predicted a means of cyber-attack. It sat on a shelf somewhere at the funding agency until, three years later, that prediction came true. I now sit on an advisory board for DoD-funded cyber-security research.

3. Avoid wars about semantics. Few things waste more time and energy than an intellectual skirmish over a trivial point. Instead, use the challenge as an opportunity to connect.

Example: When challenged purely on your use of a word, agree that there’s probably a better word available, and ask everyone to temporarily use a made-up word like “zorch.” Replace every instance of the challenged word with “zorch,” then define what you mean by it. The best word will eventually become obvious, and the process can contribute to harmony rather than discord.

4. Imperfect evidence is better than no evidence. Many people don’t want to use imperfect or partial evidence to make a decision. That’s a mistake, because even a single piece of incomplete evidence can improve the accuracy of a 50-50 decision enough to make it a 75-25 decision.

Example: A box of 100 marbles contains a random mix of green and red marbles; the proportion of one colour can be anywhere from 1% to 100%. You have to guess which colour is more prevalent. Believe it or not, using the colour of just one marble sampled from the box to make your decision can improve the accuracy of your guess from 50% to 75%.
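If the 75% figure seems surprising, it is easy to check empirically. Here is a quick Monte Carlo sketch (my own illustration, not part of the original example) of the one-marble guess:

```python
import random

def one_marble_accuracy(trials=100_000, n=100):
    """How often does guessing the colour of a single sampled marble
    match the majority colour of the box?"""
    correct = 0.0
    for _ in range(trials):
        greens = random.randint(1, n)        # 1%-100% green, rest red
        sampled_green = random.random() < greens / n
        if 2 * greens == n:                  # exact tie: no majority colour,
            correct += 0.5                   # so any guess is right half the time
        elif sampled_green == (2 * greens > n):
            correct += 1
    return correct / trials

random.seed(0)
print(one_marble_accuracy())  # close to 0.75, not 0.50
```

Averaged over all possible mixes, the single sample agrees with the majority about three times out of four, which is exactly the 50-50 to 75-25 improvement claimed above.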


5. … But don’t over-value evidence. In the example above, remember that the evidence can lead you to the wrong conclusion 25% of the time. Do not discount the evidence contained in the heads of your audience. Their contextual experience may actually override your evidence… and even when it doesn’t, your job includes overcoming their incorrect biases.

Make An Impact With Visualisation

Here are my top tips for creating successful visualisations:

6. Answer the question, "Compared with what?" Effective visualisations give a reference, a context, for understanding what you're looking at and what it means. A baseball pitched at 50 miles per hour is fast if you're on the sandlot, but a pro ball player who regularly swings at 90 mph fastballs would likely hit it for a home run.

Example: If I tell you that a baseball player has a batting average of .260, do you know whether that's good or bad? But if you also know that the median batting average of all baseball players is .255, you can say that .260 is slightly above average. Now compare both with David Ortiz's ("Big Papi") 2013 World Series batting average of .688.

7. Show causality, or at least be informed by it. Visualisations are there to help us understand what happened in the past, what is likely to happen in the future, and how likely that future result is. To be useful, they need to show what caused, causes, or will cause a particular result so that it can be repeated or avoided.

Example: The primary chart used by the NASA scientists showed O-ring failure indicators by launch date. Tufte's alternative shows the same data organised by the critical factor: number of failures at a given temperature. The fateful shuttle launch occurred at 36 degrees, and Tufte's visualisation makes it obvious that there is great risk for any launch at temperatures below 66 degrees.

8. Allow you to see the forest AND the trees. Good visualisations inform you about both the detail and the big picture, so that you make the right decisions and avoid the really bad ones.

Example: If the average time to resolve a zorch problem is 15 hours, you might get the impression that you can plan on 15 hours for each one, and that to improve you should try to reduce the time spent on every one. But look at the actual distribution: 9 out of 10 zorches are resolved in only 1 hour, while the tenth takes 140 hours! If you hadn't seen the "trees" in addition to the "forest", you'd have wasted your time, and your money, on the wrong problem.
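To make the forest-and-trees point concrete, here is a minimal sketch using the made-up zorch numbers from the example above, contrasting the mean with the typical case:

```python
# Resolution times in hours for ten zorch problems, taken from the
# (made-up) example above: nine quick fixes and one monster.
times = [1] * 9 + [140]

mean = sum(times) / len(times)            # the "forest": ~15 hours on average
median = sorted(times)[len(times) // 2]   # the "trees": 1 hour is typical

print(f"mean:   {mean:.1f} hours")   # 14.9 -- driven by the single outlier
print(f"median: {median} hour(s)")   # what 9 out of 10 zorches actually take
```

The mean is pulled to roughly 15 hours almost entirely by the single 140-hour outlier; looking at the full distribution (or even just the median) points you at the real problem, the one monster zorch.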

Larry Maccherone

Larry Maccherone is the Director of Analytics for Rally Software. He has been with Rally since 2009, first as an Agile coach and then as Product Owner of the analytics team. Prior to his current role, Larry served as a software process coach, focusing primarily on Scrum and the SEI's Team Software Process (TSP). He holds qualifications and certifications in a number of software process and quality practices, including SEI Personal Software Process (PSP) instructor, TSP launch coach, CMMI assessor, ISO-9000, Rational Unified Process, and Certified Scrum Master (CSM). Before Rally, Larry served as Chief Engineer and CEO of Comprehensive Computer Solutions, a systems integrator for factory-floor automation, and was the founder of QualTrax, which made measurement and management software for ISO-9000 and other standards compliance. Larry is currently finishing his Ph.D. in Software Engineering at Carnegie Mellon. His research focuses on agile measurement, analysis, and visualisation for software and systems engineering.