Artificial intelligence (AI) is increasingly regarded as a transformative technology, with the potential to tackle significant global challenges such as curing disease, automating dangerous jobs, and driving new inventions. The prospects of AI development are widely discussed, and while its advocates argue for continued investment, concerns have been raised about the tangible benefits it has so far delivered to the American public.
A survey conducted in August 2024 by the Federal Reserve and Harvard Kennedy School finds that AI use remains concentrated in a small, highly educated demographic. Among U.S. adults aged 18-64, 39.4% reported using generative AI technologies, but the figures vary markedly by education. Workers with a bachelor's degree or higher are twice as likely to use AI on the job as those without a university degree: 40% of college-educated workers report using it, compared with just 20% of those without higher education. Adoption is highest in computer and mathematical occupations and in management roles, at 49.6% and 49.0% respectively, and markedly lower among personal-services and blue-collar workers, at 12.5% and 22.1% respectively.
While the potential rewards of AI might still be speculative, its associated costs are becoming increasingly evident. Reports suggest that a disproportionate share of these costs is being borne by the American West, where both natural resources and local communities are under pressure due to the escalating demands of AI development.
Data centres, vital to AI, accounted for approximately 3% of total U.S. electricity consumption in 2022, a figure expected to reach 9% by 2030. The surge is particularly pronounced in the Western United States, home to a concentration of technology hubs and data facilities. The International Energy Agency projects that carbon dioxide emissions from these data centres may more than double between 2022 and 2030, compounding the environmental implications for the region.
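To put those shares in rough absolute terms, here is a back-of-envelope sketch. The ~4,000 TWh figure for total annual U.S. electricity consumption is an outside approximation, not from the article, and the calculation holds that total constant for simplicity:

    # Back-of-envelope: convert data-centre shares of U.S. electricity into TWh.
    # Assumption: total U.S. consumption of roughly 4,000 TWh/year (approximate
    # figure, not stated in the article; held constant for simplicity).
    US_TOTAL_TWH = 4_000

    for year, share in ((2022, 0.03), (2030, 0.09)):
        absolute = share * US_TOTAL_TWH
        print(f"{year}: {share:.0%} of ~{US_TOTAL_TWH:,} TWh ≈ {absolute:,.0f} TWh")

    # Output: roughly 120 TWh in 2022 rising to roughly 360 TWh by 2030 --
    # a tripling of data-centre demand under these assumptions.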
The immense energy requirements of AI development stem from the computational power needed to train and run machine-learning models. Current estimates indicate that global electricity consumption by data centres, cryptocurrencies, and AI could reach between 620 and 1,050 terawatt-hours (trillion watt-hours) by 2026, enough to power approximately 94.3 million American homes for a year.
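As a rough check on that homes comparison, here is a back-of-envelope sketch. The per-home figure of roughly 10,500 kWh per year is an outside approximation of average U.S. household electricity use, not taken from the article:

    # Back-of-envelope: how many U.S. homes could 620-1,050 TWh power for a year?
    # Assumption: an average U.S. home uses roughly 10,500 kWh of electricity
    # per year (approximate figure, not stated in the article).
    AVG_HOME_KWH_PER_YEAR = 10_500

    for twh in (620, 1_050):
        kwh = twh * 1e9                      # 1 TWh = 1 billion kWh
        homes_millions = kwh / AVG_HOME_KWH_PER_YEAR / 1e6
        print(f"{twh:,} TWh ≈ {homes_millions:.0f} million homes for a year")

    # The upper bound works out to about 100 million homes; the article's 94.3
    # million implies a slightly higher per-home usage of about 11,100 kWh/year.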
This growing demand for energy, drawn largely from the already strained resources of the American West, raises serious concerns about land use, water diversion, and community impacts. The regions most essential to AI's growth may, ironically, be among the least likely to reap its benefits.
The situation increasingly mirrors past "get rich quick" schemes that exploited rural communities, with resource extraction leaving long-lasting environmental and social damage. Criticism of AI's ongoing energy and resource demands has prompted calls for accountability and transparency within the sector. Key questions remain: where the energy comes from, which communities bear the impacts, and who truly stands to gain from these advances in AI technology.
However extraordinary a future AI may hold, it remains crucial that the pursuit of its development does not come at the expense of present-day resources or the well-being of communities across the region. As the conversation around AI evolves, so does the pressing need to address its impacts on local environments and populations and to ensure a more equitable distribution of its future benefits.
Source: Noah Wire Services