from: Bloodstone43956@i-mail.irs
to: Raventrap39996@i-mail.irs
date: 7518.19206
My Dear Raventrap ~
I thoroughly enjoyed your story about the way one of your clients lost his life savings in a card game after you convinced him the odds were now in his favor. The little fool actually thought his luck was about to change. I would have laughed until I cried, if we could cry about anything. Many humans obviously don’t understand chance occurrences. The odds of a particular coin toss coming up heads are 50-50 ~ no matter how many times in a row the coin has already landed on heads. The odds of your client filling an inside straight did not change based on the fact that he had never done so before. It seems you have seen to it that he now thinks Our Competitor is out to get him. His loss is Our Executive’s gain.
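The independence of coin tosses is easy to verify with a quick simulation ~ a sketch of my own, nothing your client would ever bother to run. It estimates the chance of heads on the flip immediately after a long run of heads:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def prob_heads_after_streak(streak_len, trials=200_000):
    """Estimate P(heads) on the flip immediately following a run of heads."""
    run = 0                        # current run of consecutive heads
    after_streak = heads_after = 0
    for _ in range(trials):
        flip = random.random() < 0.5   # True = heads
        if run >= streak_len:
            after_streak += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / after_streak

# Even right after 5 heads in a row, the next flip is still about 50% heads.
print(round(prob_heads_after_streak(5), 2))
```

The estimate hovers at 0.5 regardless of the streak length, which is precisely what the client refused to believe.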
Another lovely form of deception involves lying with statistics. Since relatively few of our clients understand math, and even fewer understand statistics, this kind of lie is not difficult to pull off. Basically, there are five main tactics, any one of which might do the trick, and they are absolutely brilliant in various combinations.
First, get your presenter-clients to tamper with the scale of charts and graphs. It’s easy to make small effects look large ~ and contrariwise ~ if the scale of the chart is designed to do so. You might coach your presenter-clients to make a change of 1% look like ten times that much if you want to scare your observer-clients into believing there is a big problem, or to make them think substantial gains have been made. On the other hand, if you want to minimize the problem or show that minimal progress is being made, just advise your client to compress the scale. For example, we have already been quite successful in getting stockbrokers to present graphs of market performance this way to entice or discourage investors. You see how it works? The numbers haven’t changed, only the manner of displaying them. And since they are numbers, many of our clients will believe they are important, and will act on the “data” presented in the chart or graph. But be careful. Some clients will remain wary that past performance doesn’t necessarily predict future results.
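The trick requires no actual chart, only arithmetic. With made-up numbers, a bar on a truncated axis occupies a far larger share of the chart than the same value on an honest axis starting at zero:

```python
def bar_height(value, axis_min, axis_max):
    """Fraction of the chart's height a bar occupies for a given y-axis range."""
    return (value - axis_min) / (axis_max - axis_min)

old, new = 100.0, 101.0   # a mere 1% change

# Honest axis from zero: the two bars look nearly identical.
full = bar_height(new, 0, 110) / bar_height(old, 0, 110)

# Truncated axis from 99.5: the same change towers over its neighbor.
trunc = bar_height(new, 99.5, 101.5) / bar_height(old, 99.5, 101.5)

print(round(full, 2), round(trunc, 1))  # 1.01 3.0 -- three times as tall
```

Same numbers, same change; only the axis has been tampered with.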
Second, make sure as much data as possible is presented based on insufficient sample size. Even our most backward clients can tell that they need more than a handful of respondents to come to any meaningful conclusion about a much larger group. So, you must see to it that the sample size is kept small and, more importantly, unclear. That way your clients might assume the opinions of, say, 10 customers actually represent 1000 or more. This exercise is particularly satisfying when it comes to workplace “evaluations.” Merely mislead management into believing that the 5 complaints they received, out of 10 responses from 1000 possible respondents, really mean their hire has a 50% approval rating. The flip side of this record is also worth playing. Sometimes beleaguered employees will resort to planting positive responses, making it look like their ratings are higher ~ meanwhile no definite conclusion can be reached ~ of course ~ due to inadequate sample size!
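The standard 95% margin of error for a sample proportion makes the point numerically ~ and the normal approximation below is itself rather shaky at n = 10, which only adds to the fun:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion (normal approx.)."""
    return z * math.sqrt(p * (1 - p) / n)

# 5 complaints out of 10 responses: the "50%" figure is good to +/- 31 points.
print(round(margin_of_error(0.5, 10), 2))    # 0.31
# The same proportion from 1000 responses would be +/- 3 points.
print(round(margin_of_error(0.5, 1000), 2))  # 0.03
```

A rating somewhere between 19% and 81% is no rating at all ~ which is exactly the point.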
Another twist on this idea is to keep changing the sample until you get what you want. Do you think 4 out of 5 dentists actually agree on a certain product, or did the advertiser just keep asking the same question until they found 4 out of 5 who happened to agree? Would you believe 5 out of 5 tempters agree with Our Executive? Those who do not agree, no longer exist!
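A sketch, with an assumed true agreement rate of 50%, of how few re-polls it takes to manufacture a “4 out of 5 dentists” result:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def poll_until_agreeable(true_rate=0.5, group=5, want=4):
    """Re-poll fresh groups of 5 until one happens to show 4-of-5 agreement."""
    tries = 0
    while True:
        tries += 1
        agree = sum(random.random() < true_rate for _ in range(group))
        if agree >= want:
            return tries

# Even if only half of all dentists agree, a "4 out of 5" group turns up
# after roughly five re-polls on average (P(>=4 of 5) = 6/32).
avg = sum(poll_until_agreeable() for _ in range(2000)) / 2000
print(round(avg, 1))
```

The advertiser reports only the final, agreeable group; the discarded polls simply never happened.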
Third, if you’re going to coax your clients to re-shuffle the sample, it’s even more fun to induce them to cherry-pick the data as well. Much deception has taken place whenever data our presenter-clients don’t like is simply not reported, and only data supporting their conclusions or cause appears in their reports. By the rules of statistics, data points may be excluded only for stated, legitimate reasons ~ you need to make sure those reasons remain obscure. It’s best to report only the results that confirm what you’re trying to show in the first place. The idea is to appear unbiased while hiding data that might cast doubt on your conclusions. If you make the right moves, your clients won’t realize they’ve made an important decision based on incomplete or biased information until it’s too late and the damage has already been done.
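With invented numbers, quietly dropping the unfavorable data points flips a mixed result into an apparent triumph:

```python
# Hypothetical before-and-after changes from some intervention: mixed at best.
results = [-3, -2, -1, 1, 2, 8]

honest_mean = sum(results) / len(results)      # ~0.83: no clear effect

# Quietly drop the "outliers" that spoil the story, with no stated criterion.
kept = [x for x in results if x > 0]
cherry_picked_mean = sum(kept) / len(kept)     # ~3.67: looks like a big win

print(round(honest_mean, 2), round(cherry_picked_mean, 2))
```

The report, naturally, shows only the second number.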
Another angle to play is to encourage your clients to search for data to support their decisions after they have been made! That way cherry-picked data can lend credence to whatever they have already decided to do, making the process look like it was “data-driven” even though it was not. As cunning as this technique is, imagine how devastating it can be to an organization when the decision-makers are discovered. It’s a shame we haven’t tipped off more clients. The resulting anger and frustration can be very useful to us.
Fourth, you can easily confuse your clients with the notion that correlation is the same as causation. All you have to do is find two statistical trends that happen to agree and imply that because they are correlated, one must therefore “cause” the other. There are plenty of examples that prove this assumption false, but don’t let your clients think too much about facts. What matters most is the appearance that one thing reduces or increases the chances of another, not that it actually does so. Your aim is to get your clients to reach a particular conclusion based on hazy logic. After all, a decision based on what is presumed to be hard data is thought to be better than one based on pure conjecture. Just make sure the data is no better than guesswork and your clients will think they’ve made an informed choice, when in reality they’ve just used faulty statistics to confirm their personal biases.
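A sketch with invented monthly figures ~ two series that merely rise and fall with the seasons correlate almost perfectly without either causing the other:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented monthly figures: both series simply track the summer weather.
ice_cream_sales = [20, 25, 40, 60, 80, 75, 50, 30]
drownings       = [2, 3, 5, 8, 10, 9, 6, 4]

r = pearson(ice_cream_sales, drownings)
print(round(r, 2))  # near 1.0 -- yet ice cream causes no drownings
```

The hidden third factor ~ warm weather ~ drives both, but your clients need never notice that.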
Finally, and most subtly, “the devil is in the details,” as many humans have said. Our Executive would be the first to own this statement. The answer to any question depends on how you ask it, or in our case, how we get our clients to ask it. “Do you believe the government should waste money helping those who are not working?” is a very different question from “Do you believe children of the unemployed should receive government assistance so they will not starve?” Both questions involve spending money on the unemployed, but they are designed to elicit completely different responses. The general idea is for our clients to ask questions in a way that evokes the response they want, making their point seem stronger, while avoiding any realization that they’ve done so. It also helps to make sure the question is asked only of the demographic most likely to give the answer you want. We can tempt our presenter-clients to draw whatever picture of the data we want them to draw if we urge them to combine controlled samples with loaded questions. The result is that many observer-clients will believe what they see, because after all, numbers don’t lie. No kidding! We lie. Humans lie. And with surprisingly little guidance from us, humans can be taught to lie with numbers!
Your Devoted Cousin,
Bloodstone
