
When 26-year-old Duncan Okindo left Kenya for what he thought was a customer service job in Thailand in 2024, he expected a new beginning. Instead, he found himself trapped in a scam compound along the lawless Myanmar–Thai border. What he witnessed there revealed a growing and disturbing trend: the use of artificial intelligence tools like ChatGPT to power massive online fraud schemes.
Okindo says he spent four months confined inside a heavily guarded complex called KK Park, “fortified like it was meant for war” and patrolled by armed men. The facility, he explained, resembled other compounds in the region, most of them run by Chinese-led criminal groups. These operations exist for one purpose: running coordinated online scams against people worldwide.
Inside, hundreds of forced laborers sat in rows, logged into desktop computers. Many, according to Okindo, used free versions of ChatGPT to create convincing messages that tricked unsuspecting victims, mostly Americans, into investing in fake cryptocurrency ventures. This type of scam, known as “pig-butchering,” relies on building a relationship with a victim before ultimately stealing their money.
How do scammers use ChatGPT to manipulate victims?
Okindo’s primary task was to pose as a wealthy investor targeting U.S. real estate agents. He would browse property websites such as Zillow.com, searching for agents to contact. Pretending to be a rancher from Texas or a soybean producer from Alabama, he reached out to dozens of agents each week, working under strict quotas. “You need to feel familiar,” he said. “If you miss any point, the realtor will know that you are a scam.”
Each worker was required to convince at least two agents daily to deposit funds into fake investment accounts, while chatting with at least ten “clients” at a time. ChatGPT helped them craft realistic dialogue, mimic American slang, and maintain the illusion of authenticity. When targets asked about cryptocurrency or local property markets, Okindo quickly copied their questions into ChatGPT and generated polished, believable answers within seconds.
The scammers’ scripts were tightly controlled, outlining every step of a week-long conversation. By the third or fourth day, they would introduce cryptocurrency as an “investment opportunity,” directing victims to fake trading platforms controlled by the criminal group. Once victims transferred real money into these accounts, it vanished instantly into digital wallets that were nearly impossible to trace.
According to Okindo, ChatGPT was “the most-used AI tool to help scammers do their thing.” It allowed them to adapt to different personalities, improvise responses, and maintain consistent, fluent English. This made it harder for victims to suspect anything unusual, even in long conversations.
What happens inside these AI-driven scam compounds?
Failure to meet targets came with brutal consequences. Okindo described seeing fellow workers humiliated, beaten, or even shocked with electric batons by their supervisors. When someone succeeded in defrauding a high-value victim, the bosses celebrated by ordering workers to beat drums in the compound. “My dignity was reduced to ashes,” he said.
The entire setup ran like a digital factory of deception, with each person trained to specialize in different scam types, from fake investments to romance cons. Okindo eventually escaped in April after Thai authorities cut off power to several scam compounds, forcing the release of some captives.
A representative from HAART Kenya, an anti-trafficking group that helped rescue him, confirmed Okindo’s story aligns with reports from several other Kenyans freed earlier this year. While Reuters could not independently verify every detail, his testimony matches those of other victims forced to run online scams in Southeast Asia.
OpenAI, the company behind ChatGPT, told Reuters that it “actively works to identify and disrupt scam-related misuse of ChatGPT.” It added that its systems are designed to reject fraudulent requests and that violators are removed once detected. Zillow, one of the sites where scammers searched for victims, declined to comment.
Other victims share similar experiences. Two Burmese men told Reuters that they, too, were forced to use ChatGPT in scam operations. One of them, who ran romance scams, said the release of the chatbot in 2022 changed everything. He could now charm multiple victims at once, sending AI-generated poems and flirty messages that sounded authentic. “The bot’s persuasive writing lent our words a credibility that made the victims trust more in us,” he said.
Okindo has since returned to Kenya but still faces fear and stigma. He says he has received threatening calls from individuals he believes are connected to the cartels. Despite the trauma, his account serves as a crucial warning: as artificial intelligence becomes more accessible, criminals are finding new ways to weaponize technology for fraud.
By Lucky Anyanje
