Research Findings

Emotions speak louder than words

By Akers C. Kuo, Ph.D., CFE

Online investment scams are on the rise, and they’re carefully crafted performances that use human psychology to lure victims. Here, the author discusses his research on the emotional tricks that fraudsters use and how understanding these tactics can help stop such crimes in their tracks.

A message from a stranger popped up in my chat app. It was from an attractive woman claiming to be with a reputable Taiwanese investment firm. She had “insider” stock picks for me. I knew it was a scam because I’d seen this before. As someone who works in fraud prevention, I get these messages all the time. Usually, I ignore them. But I decided to follow the conversation and see where it led. I never sent money and never clicked links. But I stayed just long enough to watch the entire play unfold. What I found was both fascinating and frustrating.

For several days, the woman sent buy-and-sell recommendations with precise entry and exit prices. Each time, the stocks moved just as she’d predicted. I didn’t invest money, but I told her I did. 

Soon, I was invited into a private investment chatroom. It felt alive with active discussions, market insights and casual banter. Then “Mr. Golden” appeared. He introduced himself as an investment instructor and invited me to join his regular webinars. I joined several. They were surprisingly sophisticated. Eventually, he pitched a VIP membership: $5,000 for a 20% monthly return or $15,000 for 35%. It was too good to be true, and I knew it.

Mr. Golden connected me with another “adviser” who laid out the process. I’d be investing through a trading platform called MT5 (MetaTrader 5) by opening an account with a foreign broker. One search revealed that the broker had long been suspended. This setup, an MT5 platform paired with a suspended foreign broker, is a classic scheme that has become increasingly common across Asia.

Investment Scam Life Cycle

The format is familiar: a friendly message, slow and steady trust-building and then a shift into group chats where fake investors share success stories and talk up a particular opportunity. Eventually, victims are directed to invest through obscure overseas platforms. 

What makes these scams especially hard to stop is how easily they cross borders. A victim in Taipei might wire money to an account in Hong Kong, through a brokerage registered overseas, managed by someone operating out of Southeast Asia. By the time anyone realizes something’s wrong, the money’s long gone, and no single country has full visibility of the process.

This was just one of several chatrooms I researched. Each followed a remarkably similar script. They were polished, emotionally persuasive and timed like clockwork. But as a fraud researcher, I saw an opportunity. Could we build a detection model based not on what scammers say, but on how they say it? In other words, can emotional manipulation be a detection signal for fraud?


From emotion to analytics

Scammers don’t just lie — they perform. From the moment they lure you in, their language is loaded with emotional cues. After encountering scams myself and seeing similar cases in the news, I decided to conduct an independent study analyzing over 29,000 messages from seven scam chatrooms in Taiwan. The study examined how scammers’ emotional fluctuations shifted across what I call the Investment Scam Life Cycle (ISLC), which comprises five stages: Luring, Alluring, Catching, Executing and Vanishing.

  1. In the Luring phase, scammers quietly gather targets, often by posing as trusted institutions and using profile pictures of attractive women. Their goal is to create a credible hook — one that blends romance and finance.
  2. In the Alluring phase, they initiate contact and share “free” stock tips. These tips seem to perform well, building credibility. Language during this stage is emotionally warm, filled with words like “opportunity,” “trust” and “benefit.”
  3. In the Catching phase, the victim is invited into a group chat that feels dynamic and legitimate. Scammers — sometimes aided by bots or accomplices — establish proof that their investment strategy works. Members echo success stories. Emotionally, this stage is the most intense and is designed to create a sense of excitement, urgency and fear of missing out (FOMO).
  4. Now the trap is sprung. Victims are directed to unfamiliar platforms or shady brokers in the Executing phase. The tone remains supportive, but pressure mounts. There are deadlines, technicalities and “account managers” offering to walk you through it all.
  5. After the funds are transferred, communication slows. The platform malfunctions. The advisers disappear. The chatroom goes quiet. Victims are left alone — and broke. That’s the Vanishing phase.

From these real-life observations, I developed a machine-learning-based fraud detection model focused not on financial transactions, but on emotional fluctuations. Why emotions? Because in investment scams, fraud doesn’t begin with a demand for money — it begins with trust. And trust is built emotionally, not logically. 

Most conventional fraud detection systems rely on transactional red flags or binary emotion classification, where they flag “negative” sentiment as risky and “positive” sentiment as safe. But that approach misses something critical: modern scammers don’t threaten; they reassure. Their emotional language is overwhelmingly positive. They use words and phrases like “opportunity,” “confidence,” “happy for you” and “you deserve it.”

In the study, I found that more than 60% of scam messages were classified as positive. That was particularly true during the most dangerous stage of the scam, when the victim is being pressured to invest.

But I didn’t stop at just labeling text as “positive” or “negative.” Instead, I used a Mandarin-language emotion dictionary that captured seven distinct emotional states: good, happy, expectation, surprise, fear, anger and sadness. The result is a more nuanced map of how scammers’ emotional tone shifts across the scam life cycle.
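To make the idea concrete, here is a minimal sketch of this kind of lexicon-based scoring. The English word lists are illustrative stand-ins of my own invention, not the Mandarin dictionary used in the study, and the `emotion_profile` function is a hypothetical helper:

```python
from collections import Counter

# Hypothetical seven-emotion lexicon. These English word sets are
# illustrative placeholders for the Mandarin-language dictionary.
EMOTION_LEXICON = {
    "good": {"opportunity", "benefit", "trust"},
    "happy": {"happy", "congratulations", "glad"},
    "expectation": {"soon", "upcoming", "potential"},
    "surprise": {"unbelievable", "amazing"},
    "fear": {"risk", "loss", "worried"},
    "anger": {"unfair", "angry"},
    "sadness": {"sorry", "regret", "missed"},
}

def emotion_profile(message: str) -> dict:
    """Count how many words in a message fall into each emotion class."""
    counts = Counter()
    for word in message.lower().split():
        word = word.strip(".,!?")  # drop trailing punctuation
        for emotion, vocab in EMOTION_LEXICON.items():
            if word in vocab:
                counts[emotion] += 1
    return {emotion: counts.get(emotion, 0) for emotion in EMOTION_LEXICON}

profile = emotion_profile("This opportunity is amazing, trust me, I'm happy for you!")
# The profile skews heavily positive, with zero fear or sadness —
# exactly the signature described above.
```

A real implementation would need word segmentation for Mandarin (which has no whitespace between words), but the principle, tallying each message against per-emotion vocabularies, is the same.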

That level of detail can give fraud examiners some unexpected advantages. First, it helped track how perpetrators change emotional gears. They don’t suddenly become aggressive. Instead, they stay upbeat, but the tone shifts from gentle encouragement to subtle pressure. Language that once sounded optimistic starts to feel more urgent but remains friendly. It’s manipulation, just harder to spot.

Second, I noticed what wasn’t there. Emotions like fear or sadness barely showed up at all. And that became a red flag. When every message sounds too cheerful and polished, it starts to feel unnatural. That kind of unnatural emotional response turned out to be just as telling as what was actually said. 

I took those emotional signals — not just what was expressed, but when and how — and trained a machine learning model to see if it could tell scam messages from real ones. The model detected scams with 86.5% accuracy, all without needing financial terms or account details. It worked just by reading the rhythm of the conversation.
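The shape of such a model can be sketched as follows. This toy version uses synthetic data and a simple nearest-centroid rule standing in for the study’s actual model and features; it does not reproduce the 86.5% figure, only the idea that emotion counts plus message timing can separate the two classes:

```python
import random

random.seed(0)

def fake_features(is_scam: bool) -> list:
    """Toy feature vector: [good, happy, expectation, fear, sadness, hours_between_messages].
    Scam chatter skews positive, shows almost no fear or sadness, and
    arrives on a tight, clockwork rhythm. All distributions are invented."""
    if is_scam:
        return [random.randint(2, 5), random.randint(1, 4), random.randint(1, 3),
                0, 0, random.uniform(0.5, 4.0)]
    return [random.randint(0, 2), random.randint(0, 2), random.randint(0, 2),
            random.randint(0, 2), random.randint(0, 2), random.uniform(0.5, 24.0)]

train = [(fake_features(True), 1) for _ in range(200)] + \
        [(fake_features(False), 0) for _ in range(200)]
test = [(fake_features(True), 1) for _ in range(50)] + \
       [(fake_features(False), 0) for _ in range(50)]

def centroid(samples):
    """Mean feature vector of a list of samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

scam_centroid = centroid([x for x, y in train if y == 1])
legit_centroid = centroid([x for x, y in train if y == 0])

def distance(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

def predict(x):
    """Label a message by whichever class centroid it sits closer to."""
    return 1 if distance(x, scam_centroid) < distance(x, legit_centroid) else 0

accuracy = sum(predict(x) == y for x, y in test) / len(test)
```

A production system would normalize features and use a stronger classifier, but the point stands: nothing here is a transaction amount or an account number, only emotion and timing.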


The implication is powerful: we can potentially detect fraud before the fraudster asks for money, simply by recognizing how they emotionally prepare their victims. One way to prevent someone from becoming a victim of this type of investment fraud could be as simple as a behavioral alert system integrated into messaging apps — not based on keywords, but on unusual emotional consistency. If a user receives daily messages with overly positive sentiment and financial encouragement from a new contact, the system could gently prompt: “This conversation matches known patterns used in investment fraud. Please proceed with caution.”

This kind of real-time feedback doesn’t accuse anyone of wrongdoing. It merely breaks the rhythm. And that alone can be enough to stop a scam before it escalates.
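The alert logic described above could be as small as a single rule. The function, thresholds and the 0-to-1 sentiment score below are all illustrative assumptions, not a specification of any existing app:

```python
def should_alert(daily_sentiments, days_known,
                 min_days=3, positivity_floor=0.8, new_contact_days=30):
    """Flag a recent contact whose messages have been uniformly positive
    (one sentiment score per day, scored 0..1) for at least min_days.
    All thresholds are hypothetical defaults."""
    if days_known > new_contact_days:      # established contacts are exempt
        return False
    if len(daily_sentiments) < min_days:   # not enough history yet
        return False
    # Unbroken positivity from a stranger is the red flag, not negativity.
    return all(s >= positivity_floor for s in daily_sentiments)

# Five days of uniformly upbeat messages from a two-week-old contact
# would trip the alert; a mixed-tone conversation would not.
```

Note what the rule rewards: emotional variety. Ordinary conversations contain the occasional complaint or bad day; a scam script rarely does.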

The cast behind the chat

One thing I found striking while going through this experience — and later when I reviewed other cases — was how deliberate the roles were. Nothing felt random. It was like watching a play, with each character stepping in right on cue.

First came the Fronter — friendly, patient and always polite. She said she worked for a well-known investment firm and offered stock tips “just for reference.” There was no pressure, just small talk, charts and a smiley emoji here and there. The Fronter leads with friendliness and consistency. She never pushes. She just reminds, encourages and follows up.

Then came the Closer, who showed up in the group chat. This person had the credibility. They sounded like an expert, and maybe even a bit inspiring. In my case, he was called “Mr. Golden,” and he ran what looked like a real online class. He talked about returns, strategies and memberships. He was the one who made the offer feel too good to pass up. The Closer often appears in the group setting and creates a sense of community, momentum and expert endorsement.

And finally, there was the Verifier — the one who handled the “process.” He explained how to register on the platform, open an account with a foreign broker and make the wire transfer. There was no sales talk, just step-by-step instructions like a helpful support agent might provide. The Verifier offers legitimacy. He handles logistics with precision, reducing any friction that could derail the process.

Looking back, I realized the scammers weren’t just improvising. They were following a script. And if you’ve ever worked on an investment scam case, I’m sure these roles sound familiar. You may not have called them by name, but you’ve seen them before.


Why they play the long game

These scams aren’t just random messages sent out to see who bites. There’s a structure to them. They plan who talks to you first, what they say and when they hand you off to someone else. You can tell they’ve done this before. They know what works. And when something doesn’t work, they adjust.

Each handoff in this process is designed to eliminate any doubt the victim might have and ensure that the scam moves forward. In the same way a con artist manipulates his target, these scammers manage not just the victim’s money, but their emotional bandwidth.

By understanding this process, fraud fighters can build models to interrupt the narrative before the victim is fully invested emotionally. The rational scammer, when confronted with resistance earlier in the process, may choose to move on to an easier target. And in this game, making the scheme less convenient can sometimes be more powerful than making it more punishable.

Lessons learned

Here’s what really stood out to me after digging into these scam conversations and building the model. First, emotions matter more than we think. Scammers don’t try to scare you off. They’re warm, polite and upbeat. That’s the trick. That friendliness lowers your guard. In my analysis, more than 60% of their messages carried “good,” “happy,” or “positive” tones. It’s not random. They’re doing it on purpose, because it works.

Second, they’re not improvising — they’re following a script. It’s the same story over and over: a friendly hello, a tip or two, a chat group full of “happy investors” and then the big pitch. They’ve got it down to a routine. That means we can learn the rhythm too — and step in early, before someone gets pulled all the way in.

Third, modeling their behavior works. I fed those emotional shifts into a machine learning model — no transactions, no account information, just language and timing. The model was able to pick out scam messages with great accuracy. That’s huge. It means we don’t have to wait for someone to lose money to know something’s wrong.

Fourth, they’re strategic — not chaotic. These scammers put in real effort. They build fake profiles, keep conversations going for days, even hold fake online seminars. Why? Because the payoff is worth it to them. If we make their job harder by spotting them earlier, they’ll move on. It’s all about shifting the balance of effort.

Finally, most fraud systems still wait for the transaction. But the scam starts before that, in the tone of the chat, the way trust is built and the way language changes over time. If we can flag that with alert systems integrated into messaging apps, we’re not just reacting anymore. We’re getting ahead of it.

Fraud isn’t just in the numbers. It’s in the emotions. The sooner we learn to detect those signals, the sooner we stop the show.



This column has been adapted for practitioners from the author’s peer-reviewed article, “Constructing an Investment Scam Detection Model Based on Emotional Fluctuations Throughout the Investment Scam Life Cycle,” published in the journal Deviant Behavior in 2024.

Akers C. Kuo, Ph.D., CFE, is the section chief auditor at the Far Eastern Group in Taiwan. He also serves on the board of directors and as a lecturer for the ACFE Taiwan Chapter. In 2022, he was a finalist for the 31st Outstanding Internal Auditor Award by the Institute of Internal Auditors-Chinese Taiwan (IIA-Taiwan). Contact him at summers0309@gmail.com.
