
Why do they do it?
Read Time: 16 mins
Written By:
Dick Carozza, CFE
People inside and outside an organization hit by fraud often ask how a huge fraud scheme could flourish unnoticed. We might assume that everyone involved must be ruthless, greedy and outright bad. However, fraudsters can have incentives other than greed to commit their crimes.
Our surroundings often influence our unconscious behaviors. Maybe you’ve had this experience at home. If your apartment or house is clean, you’re more willing to invest time and effort to keep it that way than when it’s been a mess for a while. For example, let’s say your domicile is spotless. You then prepare a meal, and the oven gets dirty. You feel the urge to clean up the oven immediately. Why is that?
Now, maybe the others you might be living with don’t share the same cleanliness standards. They leave their socks on the floor and don’t clean up their dirty dishes. And so, your home progressively (or regressively) changes from super clean to a little bit unkempt to certifiably messy. The next time the oven is dirty, you don’t seem to have the inner need to clean up the oven immediately. Maybe you’ve even lowered your standards of what you perceive as clean.
Robert Cialdini, a well-known U.S. psychologist, has conducted many studies on how surroundings influence human behavior. He and his colleagues conducted a two-part experiment during daylight hours in a hospital’s multilevel parking garage that was either extremely messy — with cigarette butts, bubble gum and plastics lying on the floor — or immaculately clean. They placed fliers with an irrelevant text under the windshield wipers of the cars parked in the garage.
The experiment’s subjects were 139 visitors to the hospital. When the visitors left the hospital and entered the parking garage, 14% threw the fliers from under their windshield wipers onto the floor of the clean garage, compared to 32% in the dirty garage.
Cialdini and his team went one step further. They hired an accomplice who’d hold the same flyer in his hands as the car owners entered the dirty garage and would toss it to the floor in front of the car owners. The percentage of people who followed his lead and threw the flyer to the ground rose to 54%. (See A Focus Theory of Normative Conduct: Recycling the Concept of Norms to Reduce Littering in Public Places, by Robert B. Cialdini, Raymond R. Reno and Carl A. Kallgren, Journal of Personality and Social Psychology, Volume 58, No. 6, 1990, pages 1015-1026.)
So, what do these observations imply about corporations and their cultures? First, people tend to behave like those around them. They might think, “Oh, there’s trash all over the floor. Other people threw it there. It might be okay to throw trash on the floor around here.” Second, if people see firsthand behavior that violates a general prescriptive rule like “Don’t throw your trash on the floor,” they might adopt the behavior they observe from others instead of sticking to the rule.
In both scenarios, the environment makes it easier for the car owners to rationalize their bad behavior: “Everyone litters, so why shouldn’t I?!” or “This one flyer is nothing compared to all the other dirty stuff lying around here.”
An organization’s prescriptive rules include its code of conduct, guidelines, instructions, process charts, visions etc. However, these rules appear less binding if some employees — especially management leaders — disregard them openly. (It’s the basic tone-at-the top scenario.) In the experiment, the car owners followed the lead of strangers. So, what impact does a trusted leader’s deviant behavior have on employees when they see it firsthand?
Many successful fraud schemes require the collaboration of several skilled individuals to avoid internal controls or disclosure. We increasingly see this because more companies are facing stricter regulations, and so they’re investing significant amounts of money in their internal control environments. But how can there be so many immoral people out there who break the rules unnoticed?
In reality, not all of these people need to be immoral to become fraudsters. Even people with apparently high ethical standards and levels of conscientiousness can demonstrate situationally deviant behavior. Why does this happen?
The well-known psychologist Stanley Milgram showed in his experiments how far ordinary people would go to follow instructions, behavior he called “destructive obedience.” In an infamous study, he and his team, according to the abstract, ordered a naïve subject to ask a victim questions and administer increasingly severe electric shocks when the victim answered incorrectly. However, the “victim” was a willing accomplice who didn’t feel a thing.
With each wrong answer, the accomplice would feign discomfort, then whimper, and finally scream and plead for mercy to stop the experiment. When subjects expressed concern, the experiment director would encourage them to continue.
Psychologists at the time estimated that only 1% to 2% of subjects would follow the instructions to the end, and that those who did would be mentally ill. In fact, 65% of the subjects gave a potentially deadly shock to the accomplice. (See Behavioral Study of Obedience, by Stanley Milgram, Journal of Abnormal and Social Psychology, November 1963, pages 371-378.)
Subjects in Milgram’s experiments, who probably possessed high levels of the conscientiousness personality trait, might have said they’d learned to follow the rules set by the experiment director, someone higher in the hierarchy, because they wanted to fulfill expectations and perform properly for the person in charge.
Employees who’ve signed work contracts with organizations might also feel obligated to explicitly obey their managers’ questionable demands. Crooked managers could take advantage of newbies, who might not yet understand “how people work around here,” to trick them into fraud schemes. Ethically challenged employees could rationalize obeying their managers (“diffused responsibility”) and neutralize their consciences by thinking, “If they give me this task, they are responsible for the outcome.”
An example would be if a manager tells an employee to “Just change these two numbers in the budget presentation for upper management.” The employee will think, “To get what we want, we don’t need to be entirely honest.”
Workers don’t need to have mental disorders to behave deviantly. Sometimes they just follow what they perceive a leader or a group of managers would expect from them.
Top executives can also fall prey to persuasive CEOs. Aaron Beam, the first CFO and cofounder of HealthSouth, says he succumbed to committing fraud by overstating earnings to please his boss, CEO Richard Scrushy, and Wall Street. Beam said Scrushy — a charismatic, persuasive leader — convinced him to change the numbers for four quarters, though he didn’t want to do it. He couldn’t stand the pressure, so he quit. Years later, HealthSouth was caught, and Beam was indicted and convicted. He served three months in jail for his reluctant fraud. “I’m [now] known as a felon, a fraudster,” he says. “I embarrassed my family; my daughter and my wife went through a lot of pain, but they stayed with me. …” (See: What Happened in Vegas … Fraud Magazine, September/October 2009.)
Social psychologist Solomon Asch impressively showed how group expectations influence behavior in his 1955 conformity studies. (See Opinions and Social Pressure, by Solomon E. Asch, Scientific American, November 1955.)
In his experiments, Asch and his team presented two large white cards to a group of seven to nine young men, all college students. On one card was a single vertical black line. On the other were three vertical black lines of various lengths. The participants then judged which of the three lines was the same length as the reference line on the first card.
All the participants were accomplices on Asch’s team except for one real participant who’d make their judgment after all other group members had openly communicated their decisions.
The accomplices would give correct answers for the first few rounds, but then they’d choose an obviously wrong line. In the rounds in which the fakers chose wrong lines, the real participants followed their lead and chose the wrong line 37% of the time.
In this context, the term “social pressure” doesn’t completely apply because the people in the experiment simply adapted their behavior to the group’s. This isn’t surprising; most of us don’t want to be alone in our decisions or differ from a group that gives us trust and cohesion. Over the course of human evolution, we’ve found that following others is often useful for survival.
Corporate cultures often don’t welcome and reward speaking up. Employees sometimes don’t question their supervisors’ instructions, especially when their bosses are domineering and threatening. And employees are afraid of not fitting in and losing the benefits of the social in-group. Management can even trick confident workers with high self-esteem.
Employees might be embarrassed to talk about these issues with others. This non-shareable information festers over time and further isolates originally innocent people from their non-fraudulent peers.
Fraud examiners should encourage enlightened management to conduct integrity training sessions in which employees and managers are confronted with workplace dilemmas.
Organizations would do well to encourage an “open-error culture,” in which employees and management can discuss their mistakes freely. This can help prevent managers or influential employees from trapping employees in fraud schemes and can also keep fraud from breeding.
Don’t underestimate the power of group dynamics to lead compliant employees to commit fraud. Quarantine fraud viruses by encouraging honest managers and rooting out fraudulent executives.
Benjamin Schorn, CFE, is a forensic investigator at KPMG and self-employed profiling specialist, systemic coach and behavior analyst. Contact him at bschorn@kpmg.com and benjaminschorn.de.