
‘Juice jacking’ plus music gift cards
Read Time: 6 mins
Written By:
Robert E. Holtfreter, Ph.D., CFE
This case is fictional, but it represents a real scam. According to the U.S. Federal Trade Commission (FTC), fraudsters are again using email and text messages to impersonate the IRS — complete with the IRS logo — to steal personally identifiable information (PII) and use it for nefarious purposes.
The fraudulent message says you have a refund of $650 because you overpaid $1,000 on your 2023 tax return. “We applied for all or part of your … overpayment … to the amount you owe for other tax years. As a result, you are due a refund of $650.00.” You’re then instructed to click the “Check Your Refund” link to view your “tax refund e-statement” or fill out a document. Don’t do it: the fraudster will steal your identity or load malware onto your computer or smartphone. Either way, your PII will be compromised.
“If someone contacts you unexpectedly about a tax refund, the most important thing to know is that the real IRS won’t contact you by email, text message, or social media to get your personal or financial information,” the FTC advises. “Only scammers will.”
If you clicked on a link in one of these messages, or you shared personal or financial information, the FTC recommends reporting it at IdentityTheft.gov to get a free, customized recovery plan. (See “The IRS doesn’t send tax refunds by email or text,” by Larissa Bungo, FTC, Jan. 23, 2024.)
Fraudsters posing as government agencies are mailing fake forms and letters to small business owners, demanding immediate payment to register or renew business licenses or trademarks, according to the FTC.
The scammers try to add legitimacy to the fake government letters by including words like “United States,” “business regulation” and “trademark.” The letters direct potential victims to a website where they’re asked to provide a business license, Social Security number, Employer Identification Number and credit card number. In typical fraudster fashion, the letters warn recipients that they’ll be fined if they don’t respond quickly.
Before responding to one of these letters, verify the request directly with the government agency it claims to come from. If you spot a scam like this, tell the FTC at ReportFraud.ftc.gov. Also report the scam to your local law enforcement agency and media outlets. (See “Government impersonators mail fake notices to business owners,” by Bridget Small, FTC, Feb. 13, 2024.)
Many organizations believe ChatGPT and other generative AI platforms will improve employee productivity and product innovation, which will lead to increased profits.
One common feature being added to these platforms, especially newer ones, is the ability to upload files that can contain massive amounts of information. If that data ends up in insecure systems, however, it’s vulnerable to data breaches and to the loss of PII and sensitive business information.
Menlo Security addressed these issues after analyzing AI interactions from a sample of 500 global organizations and released its findings in its June 2023 study, “The continued impact of generative AI on security posture.”
The study shows that the types of data employees are entering into new generative AI platforms include PII (55.11%), confidential information (39.86%), restricted information (3.44%), medical information (0.9%), payment card industry data (0.03%) and other information (0.67%). The high PII exposure has serious implications for identity theft, which is already at record levels.
The study also showed that organizations enter data into AI platforms by typing it in, copying and pasting it, or uploading files. Copying and pasting and file uploads pose the biggest risk of data loss because they typically involve large amounts of data, and data loss through file uploads is increasing.
The Menlo Security study provided steps to help minimize the loss of PII and other sensitive information when using generative AI platforms.
Please use this information in your outreach programs and among your family members, friends and co-workers.
As part of my outreach program, please contact me if you have any questions on identity theft or cyber-related issues that you need help with or if you’d like me to research a scam and possibly include details in future columns or as feature articles.
I don’t have all the answers, but I’ll do my best to help. I might not get back to you immediately, but I’ll reply. Stay tuned!
Robert E. Holtfreter, Ph.D., CFE, is a distinguished professor of accounting and research at Central Washington University. He’s a member of the Accounting Council for the Gerson Lehrman Group, a research consulting organization, and a member of the White Collar Crime Research Consortium Advisory Council. He’s also the vice president of the ACFE’s Pacific Northwest Chapter and serves on the ACFE Advisory Council and the Editorial Advisory Committee, and he serves on the ACFE’s inaugural CFE Exam Content Development Committee. He received the ACFE’s “Outstanding Contribution to Accounting” award in 2005 and the “Instructor of the Year” award in 2006. Holtfreter was the recipient of the Hubbard Award for the best Fraud Magazine feature article in 2016. Contact him at doctorh007@gmail.com.