From writing research papers to generating malicious software code, ChatGPT has swept across the Internet with its remarkable ability to mimic human intelligence. It may seem like a fairy tale, but the human cost of innovative technology like ChatGPT is largely neglected. A recent investigation found that OpenAI's AI models were reportedly trained with the help of outsourced, underpaid Kenyan workers exposed to abusive content.
First reported by TIME, the investigation revealed that the chatbot was developed with the help of a data-labeling team in Kenya earning less than $2 an hour. The fact that these workers were exposed to the worst of the Internet, including material from the dark web, is a major cause for concern.
Reader discretion is advised. The material described below contains explicit details of child sexual abuse, bestiality, murder, suicide, torture, self-harm, and incest, meaning workers were exposed to the darkest and most disturbing corners of the Internet. According to the investigation, they also had to watch videos depicting the same subject matter.
Workers were reportedly paid $1 to $2 per hour, or up to $170 per month, to read through hundreds of such submissions.
San Francisco-based Sama, which oversees the team in Kenya, claims it provides its employees with access to "professionally qualified and licensed mental health therapists" for both individual and group sessions.
According to TIME, one of the employees tasked with analyzing such material to clean up ChatGPT's training data suffered recurring visions after reading a graphic description of a man having sex with a dog. He described the work as "torture."
Sama employees reportedly objected to the nature of the content they had to read, and the company soon ended its relationship with OpenAI.
Strangely, the whistleblower claims that OpenAI's backers knew about the issue, suggesting that high-profile investors in OpenAI, including Microsoft, were aware of it, as was Tesla and SpaceX CEO Elon Musk.
In a Twitter thread, OpenAI CEO Sam Altman said the move toward AGI-level systems would be full of "terrifying moments" and "significant disruptions," but that the potential benefits made such problems worth resolving along the way.