Sam was the ideal employee at the manufacturing plant. He worked overtime, needed minimal supervision, was congenial, and often did favors for his colleagues. But behind the façade, he and two others pocketed more than $100,000 from a plant fund in a 10-year period. Sam’s crime could have been caught much earlier if a fraud examiner had used both a mathematical theorem called Benford’s Law and a software program found on most personal computers today.
In the April/May 1994 issue of The White Paper, Mark J. Nigrini, Ph.D., presented a novel analytical method of detecting numerical anomalies that could indicate fraudulent activity. Using Benford’s Law, the method involves finding a pattern in the frequency of digits in a list of figures, beginning from the far-left digit in a figure. As he wrote in the article, digital frequencies refer to the proportion of numbers that have a 1, 2, ..., 9 as a first digit, and the proportion of numbers that have a 0, 1, ..., 9 as a second, third, and so forth digit.
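Tallying first-digit frequencies is straightforward to sketch in code. The function below is a minimal illustration (not Nigrini's actual program): it assumes a list of positive, nonzero dollar amounts and returns the proportion of amounts beginning with each digit 1 through 9.

```python
from collections import Counter

def first_digit_proportions(amounts):
    """Proportion of amounts whose first significant digit is 1..9.

    Assumes every amount is positive and nonzero (as dollar values
    in an accounts payable file typically are).
    """
    # Strip any leading zeros and decimal points (e.g. 0.052 -> "52"),
    # then take the first remaining character as the leading digit.
    counts = Counter(str(a).lstrip("0.")[0] for a in amounts)
    total = sum(counts.values())
    return {d: counts.get(str(d), 0) / total for d in range(1, 10)}
```

For example, `first_digit_proportions([102.50, 23.10, 1450.00, 9.99])` reports that half the figures begin with a 1. A fraud examiner would run this over thousands of records and compare the result against the expected Benford distribution.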
The Failure of Common Sense
What are the chances that the first digit of a multi-digit number in a list or table will be nine? Many may assume that the odds are one in nine. Intuitively, we may believe that numbers in large lists are composed of random digits, each having an equal chance of appearance. Unfortunately, sometimes common sense fails us.
During the 1930s, a physicist named Dr. Frank Benford examined the frequency of certain numbers appearing as initial digits in lists of "natural" numbers. Benford divided all the information into two categories: "natural" and "non-natural" numbers. Natural numbers are those numbers that are not ordered in a particular numbering scheme and are not generated from a random number system. For example, most accounts payable files will be populated by dollar values that are natural numbers. On the other hand, Social Security numbers and telephone numbers (non-natural numbers) are designed systematically to convey information, and that design removes the natural character of the number.
Without the aid of a computer, Benford examined first-digit frequencies of 20 lists covering 20,299 observations of natural numbers. His lists covered data such as street numbers of scientists listed in an edition of American Men of Science, the numbers contained in the articles of one issue of Reader’s Digest, and such natural phenomena as the surface areas of lakes and molecular weights. Benford discovered that the distribution of the initial digits in natural numbers is not random but rather follows a predictable pattern, which is now known by his name. Benford derived a formula to predict the appearance of the initial digit in any table of natural numbers. The expected occurrence for the first digit is:
Probability (x is the first digit) = Log10(x + 1) − Log10(x)
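The formula is easy to evaluate directly. The short sketch below computes the expected frequency of each possible first digit under Benford's Law:

```python
import math

def benford_first_digit(x):
    """Benford's Law: expected probability that the first digit is x (1-9)."""
    return math.log10(x + 1) - math.log10(x)

for d in range(1, 10):
    print(f"{d}: {benford_first_digit(d):.3f}")
```

Running this shows that a 1 leads about 30.1% of natural numbers while a 9 leads only about 4.6%, which is why the intuitive one-in-nine answer fails. The nine probabilities sum to 1, as the telescoping logarithms guarantee.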