Kenyan gig worker exposes hidden labor of AI companion bots

NEW YORK, UNITED STATES — A Kenyan gig worker has lifted the veil on the hidden human labor powering artificial intelligence (AI) “companion bots,” exposing the emotional and financial toll of an industry often portrayed as fully automated.
Michael Geoffrey Asia, an aviation graduate, shared his story with the Data Workers' Inquiry, an international research initiative documenting working conditions in the digital economy.
Inside Kenya’s hidden AI companion workforce
Asia, who lives in Nairobi’s Mathare slums, took a job as a “text chat operator” for the Australian firm New Media Services to support his family. But the role quickly revealed itself to be far from ordinary.
“What I didn’t know was that the role would require me to assume multiple fabricated identities, and use pseudo profiles created by the company to engage in intimate and explicit conversations with lonely men and women,” Asia wrote.
In his day-to-day work, he juggled “three to five different personas” at once, switching genders and maintaining detailed backstories to sustain conversations that could span days.
He was paid a flat rate of $0.05 per message, had to meet strict character counts, type at least 40 words per minute, and monitor a dashboard tracking his output.
“Falling behind on metrics could lead to warnings, reduced assignments, or termination,” Asia explained, highlighting the intense pressure of the role.
Emotional toll of impersonating AI companions
The work took a heavy emotional toll. Chat users, believing they were interacting with AI, shared intimate details and personal trauma.
“My faith taught me that love should be real, intimacy sacred, and that deception was destructive to both the liar and the deceived,” Asia said.
“Yet here I was, professionally deceiving vulnerable people who were genuinely looking for connection — taking their money, their trust, their hope, and giving them nothing real in return,” Asia added.
Asia also had to hide his work from his family, claiming he was a remote IT worker.
“How do you explain that you get paid to tell strangers you love them while your real family sleeps three meters away?” he wrote, underscoring the secrecy and isolation built into such roles.
This story reflects a broader pattern in the outsourcing industry, where high-stress, low-pay digital labor is often concentrated in underdeveloped regions. While technology companies market AI as autonomous and emotionally neutral, much of the work is human-powered and exploitative.
As Asia’s testimony reveals, the industry thrives on a hidden workforce whose emotional labor sustains the illusion of AI, raising pressing questions about ethics, transparency, and global labor practices.

Independent
