Artificial intelligence has quickly become part of everyday life. Students use it to generate ideas, workers rely on it to complete tasks faster, and companies continue to expand AI systems across industries. The technology feels instant, simple, and easy to access.
What many people do not see, however, is the environmental impact behind that convenience. AI tools depend on large data centers filled with powerful servers running continuously. These systems require enormous amounts of electricity and water, raising questions about sustainability as AI use grows.
The International Energy Agency reported that data centers worldwide consumed about 460 terawatt-hours of electricity in 2022. The agency also projects that electricity demand from data centers could more than double by 2026, driven partly by the rapid growth of AI applications (International Energy Agency, “Electricity 2024”).
Electricity use matters because much of the world still depends on fossil fuels for energy production. As AI becomes more widespread, the energy required to support it adds pressure to power grids and increases emissions in many regions.
Water use is another concern that is often overlooked. Data centers generate significant heat, and many facilities rely on water-based cooling systems to prevent overheating. Researchers at the University of California, Riverside, examined AI’s water footprint and found that even short AI interactions can indirectly consume water depending on where the data centers operate and how cooling is managed (Li et al., 2023, “Making AI Less Thirsty,” UC Riverside).
Technology companies have also acknowledged these growing demands. Microsoft reported that its total water consumption increased in 2022, partly due to expanding data center operations to support AI workloads (Microsoft Environmental Sustainability Report, 2023).
None of this means artificial intelligence should be dismissed. AI has useful applications in research, education, and efficiency, and it may even contribute to sustainability solutions in areas such as energy forecasting or climate science. At the same time, it is important to recognize that these tools come with real environmental costs.
AI is often presented as invisible and effortless, but it depends on physical infrastructure, constant computing, and large-scale resource use. Most users never think about the electricity or water required behind a simple prompt.
As AI becomes more integrated into daily routines, the conversation should expand beyond innovation alone. Companies should be transparent about resource consumption, and researchers should continue developing more efficient systems. Users should also think critically about how often, and why, they turn to these tools.
Artificial intelligence is shaping the future, but sustainability must remain part of that future as well. Convenience should not replace responsibility. If AI continues to grow, its environmental footprint should remain part of the public discussion.