1 min read

Link: A writer who works for a tech company describes how he helps train AI models to write, by making up pretend responses to hypothetical chatbot questions (Jack Apollo George/The Guardian)

For hours each week, I produce content for a major tech company alongside accomplished novelists and academics. Surprisingly, our meticulously crafted works are destined for AI training, invisible to the public eye.

These AI systems, like ChatGPT, require vast amounts of nuanced human writing to train effectively. My job involves creating exemplary responses for these models, teaching them to recognize 'good' writing and avoid fabrications.

Researchers estimate that sometime between the mid-2020s and the early 2030s, AIs could exhaust the supply of publicly available human-written text for training. There's concern that retraining AIs on their own generated output could degrade model quality, a phenomenon known as 'model collapse'.

Despite the risks, there's an industry-wide push to maintain a human element in AI development to ensure the models remain beneficial and accurate. High-paying roles for creating quality "gold-standard" training data are on the rise as companies recognize the critical need for meticulously curated inputs.

Yet there's an inherent irony in being paid to advance technology that could make our roles obsolete. My work helps refine AIs that are replacing traditional writing jobs, but it's financially rewarding enough to justify the continued effort.

It seems we are witnessing a burgeoning economic model that simultaneously values and undermines human writing. While technology continually advances, the demand for high-quality human-generated content for AI training appears secure for now. #

--

Yoooo, this is a quick note on a link that made me go, WTF? Find all past links here.