The growth of AI depends on a vast physical infrastructure. In 2024, data centres worldwide consumed around 415 terawatt-hours of electricity, about 1.5% of total global consumption, an average rise of 13% annually since 2020. Efficiency gains have tempered growth in power use – for example, despite a 550% increase in US data-centre computing workloads from 2010 to 2018, the electricity those facilities consumed rose by only 6%. However, overall demand continues to surge: as computing becomes more efficient and accessible, it also becomes cheaper and more widely used, driving up total energy consumption. In response, countries such as China, Singapore and the Netherlands have introduced measures to balance economic growth with environmental limits on data-centre expansion, including temporary construction moratoriums and Power Usage Effectiveness (PUE) requirements – the ratio of a facility's total energy draw to the energy its IT equipment actually uses – to improve energy efficiency. Hyperscale AI facilities are generally more efficient than those of smaller enterprises, yet a single hyperscale facility can have a capacity of 100 megawatts or more, consuming as much power annually as 100,000 households.

Chip manufacturing adds further strain: each new generation of GPUs and server-grade semiconductors – smaller, faster and more complex than the last – requires more energy per wafer to produce. Training frontier systems can consume hundreds of gigawatt-hours of electricity, while inference costs, though variable, add substantially to overall energy demand.

Water is a parallel concern, and here too efficiency does not automatically mean a lower impact. Roughly three-quarters of a data centre's water footprint is indirect, coming from its electricity consumption, but direct use for cooling can still place real strain on local resources, particularly in regions already facing scarcity.
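The PUE metric and the household comparison can be sanity-checked with simple arithmetic. The sketch below is illustrative only: the 120 GWh/100 GWh facility is invented, and the annual household consumption of 8,760 kWh is simply the figure implied by the 100 MW ≈ 100,000 households claim, not sourced data.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    1.0 is the theoretical ideal; lower is more efficient."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility drawing 120 GWh overall while its IT load uses 100 GWh:
print(round(pue(120e6, 100e6), 2))  # 1.2

# A 100 MW hyperscale facility running flat out, year-round:
capacity_mw = 100
annual_gwh = capacity_mw * 8760 / 1000   # MW x hours/year -> GWh (876 GWh)
household_kwh = 8760                     # assumed annual consumption per household
households = annual_gwh * 1e6 / household_kwh
print(f"{annual_gwh:.0f} GWh/yr = {households:,.0f} households")
```

On these assumptions the 100 MW facility consumes 876 GWh a year, matching the order of magnitude of the 100,000-household comparison.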
As data centres expand globally, their aggregate water use mounts even where individual facilities' consumption is modest, and efficiency improvements can encourage further growth overall, mirroring the effect seen with energy. Local conditions matter: volumetric totals can be misleading if scarcity and competing community needs are not taken into account. This has already sparked community opposition and regulatory pressure in water-stressed areas of Chile, Uruguay and the United States, prompting firms to redesign facilities or relocate projects. Shifting to 100% renewable energy could reduce overall water footprints by more than half, but such transitions take investment and time. Carbon emissions mirror these dynamics, with estimates that generative AI could add tens to hundreds of millions of tonnes of CO₂ annually by 2035 if left unmanaged.

Scarcity of the required skills compounds these pressures, especially for firms developing advanced AI systems. Elite machine-learning researchers can now command multi-million-dollar packages, in some cases receiving signing-on bonuses exceeding US$100 million as companies compete for a limited pool of expertise. While few firms disclose full research and development (R&D) spending, public estimates suggest that building frontier-scale models can cost tens to hundreds of millions of dollars, even though recent entrant DeepSeek claims to be able to train competitive models for under US$300,000 – a figure that is contested. For firms that use rather than build AI, the challenge is different but no less significant: training staff, integrating systems and ensuring their effective use all demand new investment and expertise.

Beyond the need for AI-related skills, AI is shaping labour markets in other ways. The UK estimates that 10–30% of current jobs are automatable with AI, with professional occupations in finance, law, business management and education the most exposed.
However, other studies differ on the sectors most likely to be affected and the probable level of disruption across the economy as a whole. Entry-level roles are declining in AI-user sectors, even as new opportunities emerge in developing and deploying frontier systems, and these pressures will only become more visible. In addition, adopting AI effectively requires an AI-literate workforce capable of using tools safely and productively – without leaking sensitive data or misapplying outputs – and AI-proficient engineers who can tailor models, build integrations and rework back-office processes to capture real productivity gains. These skills are increasingly prerequisites for competitiveness and come with their own business costs.

For any country hoping to prosper in the AI era, all types of skills matter: elite researchers, proficient engineers who integrate and maintain the systems, and data-literate workers who use AI effectively and safely. Each underpins a different layer of the ecosystem, from frontier research to responsible deployment to broad workforce readiness. As with energy and water, the increasing demand for talent will become more evident; for consumers it will manifest in higher subscription fees, premium pricing for advanced features and slower rollout of new capabilities as companies absorb or pass on the rising costs of remunerating highly skilled employees.