Training large language models such as GPT-3 requires significant computational resources, consuming over 1.25 billion watt-hours of energy per training run. Storing AI models and their training data can require petabytes of capacity, which in turn draws substantial power. Network infrastructure and data transfer add further to AI's environmental impact, as do the development teams and office spaces that support this work.
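To put the training-energy figure above in more familiar units, the conversion is pure arithmetic (no new data is assumed):

```python
# Convert the quoted training-energy figure into common units.
wh = 1.25e9           # 1.25 billion watt-hours, as cited above
kwh = wh / 1e3        # kilowatt-hours
mwh = wh / 1e6        # megawatt-hours
gwh = wh / 1e9        # gigawatt-hours
print(f"{kwh:,.0f} kWh = {mwh:,.0f} MWh = {gwh:.2f} GWh")
# → 1,250,000 kWh = 1,250 MWh = 1.25 GWh
```

Expressed this way, a single training run at this scale consumes on the order of a gigawatt-hour of electricity.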