Feeding the Machine: How AI’s Rise Depends on Low‑paid Labor and Vast Natural Resources

James Muldoon’s reporting reframes generative AI as a large‑scale extraction system that depends on low‑paid labour, unconsented creative material and vast energy and water resources. The phenomenon deepens global labour competition, concentrates managerial control, and risks reproducing Western cultural biases unless regulated.


Key Takeaways

  • AI relies heavily on human labour and creative works: content moderators, data labellers and artists are central inputs, often without fair compensation.
  • Training large models consumes massive electricity and water, pushing firms toward long‑term energy contracts and reshaping local resource markets.
  • Digital connectivity has made it cheap to relocate AI tasks globally, intensifying competition and downward pressure on wages and conditions.
  • China largely keeps data‑labelling work domestic through close government‑industry ties, a corporatist contrast to the US outsourcing model.
  • Generative models embed linguistic and cultural biases from their datasets, risking a form of data colonialism that marginalises non‑Western knowledge.

Editor's Desk: Strategic Analysis

Muldoon’s account shifts the frame for policymaking: regulating AI cannot be limited to model performance or narrow safety tests but must address the upstream chain of data collection, labour practices and energy infrastructure. Expect legal and political pressure on copyright rules, energy planning and cross‑border labour standards; intellectual property litigation (already emerging in Europe and the US) may force business model adjustments, while countries with scarce water or electricity will face hard choices about hosting energy‑hungry facilities. Strategically, democracies that care about informational pluralism should fund digitisation of underrepresented languages and knowledge systems and consider industrial policies that prevent extraction‑driven dependency on foreign tech firms.

China Daily Brief Editorial

James Muldoon’s new field study dispels the romantic myth of a self‑sufficient, autonomous artificial intelligence. Drawing on interviews with subjects ranging from Kenyan content moderators to Icelandic data‑centre managers and Irish voice artists, his book reframes generative AI not as a mysterious invention but as a global extraction system that consumes human labour, creativity and environmental inputs.

Muldoon documents concrete human costs behind the lines of code: moderators who repeatedly view violent imagery until traumatised, voice actors whose recordings were repurposed without consent, and armies of data‑labelers in low‑cost regions preparing datasets. He also traces the physical infrastructure — energy‑hungry data centres, cooling systems and transcontinental fibre — that keeps large language models running, emphasising that models’ outputs are built on material foundations rather than ethereal algorithms.

The extraction Muldoon describes sits on a long history of capitalist outsourcing and platform work, but with important intensifications. Unlike earlier waves of offshored manufacturing or call‑centre labour, AI systems absorb entire canons of creative work at scale: millions of books, articles, images and songs are harvested to generate products that are then monetised by firms, often without authorial consent or compensation.

That dynamic is enabled by modern connectivity and automated supply chains. Muldoon highlights milestones such as the arrival of high‑capacity fibre to East Africa, which opened a new pool of remote labour able to perform annotation and moderation for western platforms. Because training and labeling tasks can be routed digitally, the marginal cost of shifting labour between providers is low, producing a relentless global competition that pressures wages and working conditions.

Energy and water footprints amplify the geopolitical stakes. Data centres running AI workloads can consume water at the scale of a small town and require continuous, large‑scale electricity purchases. Muldoon reports that racks running training workloads can demand several times the power of earlier enterprise customers, prompting tech firms to sign long‑term power purchase agreements or invest directly in generation — moves that reconfigure local energy markets and raise environmental concerns.

The labour consequences are paradoxical. Advanced robotics remain expensive and context‑limited, so workers remain indispensable; yet AI tools are already deskilling white‑collar roles and compressing pay. Management retains systemic knowledge and control while frontline workers become interchangeable cogs, subject to faster output quotas and precarious contracts. The result is not wholesale automation but intensified exploitation and inequality within workplaces.

China sits at the centre of this contested geography but follows a distinctive path. Whereas US firms often offshore annotation to cheaper foreign providers, China tends to internalise data‑labelling work within its own regions, aligning private firms with provincial development policies and state actors. That corporatist model reduces the leakage of sensitive data but also channels labour and infrastructural burdens into inland provinces under government‑guided industrial policy.

Muldoon warns of a quieter form of cultural domination: data colonialism. Large models trained primarily on English‑language, western‑centric content reproduce and amplify those worldviews, marginalising knowledge and practices that never entered the digitised record. For users worldwide, that means AI will not just automate tasks but shape which facts and narratives are readily accessible.

The policy implications are wide: copyright litigation is already testing the industry’s business model, energy planning must reckon with massive new loads, and labour regulation needs to catch up with digitally distributed, highly monitorable work. Muldoon’s account argues that understanding these supply chains is a necessary first step for any regulatory or collective‑action strategy that seeks to make AI’s gains less extractive and more equitable.

The book reframes the central AI question from “can machines think?” to “who pays the cost of machine thinking?” For governments, unions and civil society, the immediate task is to map the hidden inputs — human, material and ecological — that produce today’s models, and to decide how the benefits should be distributed.
