Abstract
Excerpted From: Nina-Simone Edwards, Unveiling the Environmental Impact of Large Language Models on Indigenous Communities: A Call for Action and Liability, 38 Tulane Environmental Law Journal 1 (Winter 2025) (117 Footnotes)
Large Language Models (LLMs) have become extremely popular over the last few years following the mainstream introduction of systems such as ChatGPT. LLMs, artificial intelligence (AI) systems skilled at generating and understanding language, have been used to boost productivity and increase efficiency, and they have opened up a wide range of opportunities for companies to analyze data and solve problems faster than was previously possible. For years, LLMs have been touted as "transforming" fields such as science and as profoundly impacting society as a whole. The innovation of LLMs, however, comes at an unfortunate cost to the climate.
Climate change refers to both natural and man-made shifts in temperatures and weather patterns. Human activities, such as burning fossil fuels, have been the main contributors to climate change, increasing emissions of the principal greenhouse gases. Human activities have also caused all "observed warming ... since the pre-industrial era ...." Global warming has already caused widespread, unprecedented changes. A decrease in rainfall, the disappearance of glaciers, and a rise in sea levels impact all humans on Earth, but Indigenous communities feel the effects of climate change more acutely than other groups.
While there is no official definition of "Indigenous," self-identification, culture, strong links to territories and natural resources, and historical ties to pre-colonial/settler societies are all key features. For the purposes of this Article, which focuses on the United States, "Indigenous" is used to refer to those who identify as American Indians, Alaska Natives, and Native Hawaiians. Indigenous peoples and communities (globally and within the United States) are diverse, but "there are common features that confer greater vulnerability to climate change impacts."
Due to Indigenous communities' reliance on and connection to their land, climate change impacts those communities first and often hardest. For example, "[o]ne study found that coastal Indigenous communities eat 15 times as much seafood as non-Indigenous people in the same country--food that is being heavily impacted by everything from pollution to warming waters and ocean acidification."
This Article does not seek to address the horrible ramifications of climate change for Indigenous populations. Instead, it seeks to highlight an additional source of climate change, or at least a burden that exacerbates existing harms: LLMs. These ever-popular LLMs, purported to revolutionize society, are no strangers to critique. However, their impact on the environment--and, by extension, on Indigenous communities--cannot be overstated.
This Article, in highlighting the environmental harms caused by LLMs, connects these powerful AI systems to Indigenous communities. Instead of simply bringing efficiency or productivity to these communities, LLMs are contributing to climate change and harming Indigenous communities on a variety of levels. The companies that build and deploy LLMs must be held liable for their negative impact on the environment and on Indigenous communities.
Part II of this Article discusses the environmental impacts of LLMs and how these impacts affect Indigenous communities. This part analyzes the connection between carbon emissions from LLMs and the broader climate-related challenges faced by Indigenous communities.
Part III examines the trust relationship between tribal nations and the federal government. In particular, this part explores how this relationship could give rise to an affirmative duty enforceable under the Due Process Clause, potentially obligating the federal government to control the carbon emissions generated by LLMs.
Part IV explores other pathways to liability. This part discusses calls to action that may help solve this issue, particularly those that have led to international agreements. It focuses specifically on calls to action initiated by a national nonprofit and by Alaska Natives to identify viable solutions.
Part IV also discusses another potential response to this issue: legislation. It draws on congressional legislation that has led to Environmental Protection Agency regulations and considers how similar legislative efforts could help alleviate the climate crisis to which LLMs are contributing. This part also looks at Alaskan legislation and discusses how both existing and new legislation can be effective tools for holding the companies that build and deploy LLMs accountable at both the national and state levels.
Despite the risks, harms, and proposed avenues of liability that this Article highlights, LLMs, and the rapid pace of their development, have generated significant excitement. A recent Forbes article captures this sentiment: “As amazing as ChatGPT seems to us at the moment, it is a mere stepping stone to what comes next.” Addressing the environmental harms that LLMs like ChatGPT exacerbate is pivotal to ensuring that the “stepping stone” that comes next is not even more harmful to Indigenous communities.
[. . .]
LLMs are inarguably useful: They can generate and understand language, media, and code. However, as they grow in popularity and size, so do the environmental harms--training a single LLM can rival the carbon emissions of the full life spans of five average cars. Thus, while discussing the usefulness of LLMs, it is also important to highlight their impact on the environment, as well as on those who are impacted the most. Indigenous communities, already feeling the first effects of climate change, should not suffer at the hands of innovative technology and the companies behind it. Whether through a reliable trust relationship, calls to action, legislative amendments, or regulatory expansions, the impact that LLMs have on Indigenous communities must be discussed, and the companies that build and deploy LLMs must be held liable.
Nina-Simone Edwards, Senior Institute Associate, Georgetown Law Institute of Technology Law & Policy; J.D., Georgetown University Law Center.