Jacobus Louw is 27 years old and lives in Cape Town. Last month he earned $14 filming himself walking through his neighbourhood for an app called Kled AI. The task was called 'Urban Navigation'. The payment was roughly ten times South Africa's hourly minimum wage. It bought half a week's groceries.
TLDR
AI companies facing a data drought are paying workers in developing countries small sums to film their daily lives, record their phone calls, and clone their voices. The payments range from $0.02 per minute for voice cloning to $14 for filming a walk. Users sign away irrevocable, royalty-free rights to their biometric data, with little recourse if companies repurpose it. One platform already exposed thousands of recordings through a security breach before going offline entirely.
That same month, Sahil Tigga, a 22-year-old in Ranchi, India, made over $100 letting an app called Silencio access his phone microphone to record ambient noise. He travels to capture unique settings: hotel lobbies, railway stations, places that sound different from his neighbourhood. In Chicago, an 18-year-old named Ramelio Hill sold recordings of private phone conversations with his family to Neon Mobile at $0.50 per minute. He made $200 for 11 hours of calls with his parents and siblings.
His reasoning was straightforward: 'Tech companies already capture my data. Might as well get paid.'
The data drought and where it leads
Here is the context that makes these payments possible. Researchers estimate that AI companies will run out of fresh, high-quality text data to train their models by 2026. A quarter of the highest-quality datasets have already restricted AI companies from scraping them. When AI systems are trained on AI-generated content instead of human data, the models degrade. Researchers call this 'model collapse'.
So the companies need human data. Real conversations. Real voices. Real faces. Real footage of real people walking through real streets. And they have found a way to get it cheaply.
The platforms have names that sound like startups: Kled AI pays for videos and photos of everyday life. Silencio crowdsources ambient audio and voice recordings. Luel AI, backed by Y Combinator, pays around $0.15 per minute for multilingual conversations. ElevenLabs will clone your voice for a base fee of $0.02 per minute. Neon Mobile paid $0.50 per minute for recordings of your actual phone calls.
I say 'paid' because Neon Mobile no longer exists. In September 2025, TechCrunch discovered a security flaw that exposed users' phone numbers, recordings, and transcripts. The company went offline shortly after. The people who sold their family conversations for $0.50 a minute have no way to retrieve that data. They have no way to know who has it now.
What you sign when you sign up
I spent four years at Choice reading product disclosure statements, and I have never seen terms quite like these. Users grant 'irrevocable, royalty-free' licences that allow these companies to create 'derivative works'. In plain English: once you upload footage of your face, your voice, or your conversations, you cannot take it back. The company can use it however they want, forever, and they do not have to pay you again.
Consumers have little recourse if data is repurposed.
— Professor Jennifer King, Stanford University
Professor King studies privacy and data governance. She is being polite. 'Little recourse' means no recourse. There is no mechanism to track where your biometric data goes. There is no regulatory body you can complain to. There is no compensation if the company sells your face to someone who uses it for something you find objectionable.
Ask Adam Coy. He is an actor who sold his likeness to a company called Captions for $1,000. His AI-generated face now appears in viral videos promoting unproven medical supplements to pregnant women. He did not consent to that specific use. It does not matter. The licence he signed was irrevocable.
Who wins and who loses
Professor Mark Graham at Oxford studies digital labour in developing economies. He describes this work as 'precarious, non-progressive, dead end'. The detail that matters most: platforms in wealthy countries capture all the enduring value.
Think about what Jacobus in Cape Town received for his $14. He got grocery money. The company received footage it can use to train navigation systems, sell to car manufacturers, license to defence contractors, and build upon indefinitely. The value of that footage does not depreciate. It compounds. The $14 is gone in a week.
This is the maths of the new gig economy. A young person in India earns $100 a month making their city audible to American AI systems. A teenager in Chicago sells his mother's voice for $0.50 a minute. An actor earns $1,000 and loses control of his face forever. The platforms describe this as opportunity. It looks more like extraction.
What you can actually do about this
If you are considering these platforms because you need the money, I am not going to lecture you. I grew up in a family where $14 was real money. But here is what you should know before you sign:
- Read the licence agreement. Look for the words 'irrevocable', 'royalty-free', and 'derivative works'. If those appear together, you are giving up rights permanently.
- Check whether the company has a data deletion process. If they do not list one, assume they cannot or will not delete your data.
- Search the company name plus 'security breach' or 'data leak' before uploading anything. Neon Mobile's collapse is not unique.
- Consider what you are selling. A photo of your street is different from a recording of your voice is different from footage of your face. Biometric data cannot be changed the way a password can.
- Ask yourself who else is in the recording. Ramelio sold his family's voices. Did his mother consent?
The platforms will not tell you this. They will tell you it is easy money. And it is easy, in the same way that payday loans are easy.
The comparison is not perfect, but the structure is similar. Someone needs money now. A company offers money now in exchange for something that costs more later. The person in need has limited bargaining power. The terms favour the company. The consequences arrive after the contract is signed.
I have seen this pattern before, in debt traps and insurance fine print and buy-now-pay-later schemes. The difference is that this time, the collateral is not your credit score. It is your face, your voice, your identity. And unlike a debt, you cannot pay it off.