May 2, 2025
Beyond the Hype: The Invisible Cost of AI Curiosity


In this episode, we explore the measurable expenses of pushing AI boundaries, from CO2 emissions to workforce stress, drawing on recent research.
Key Learnings:
- The energy cost of LLMs: training and usage emit thousands of tons of CO2, with AI electricity demand projected to be comparable to powering entire countries by 2026 (a rough estimation sketch follows this list).
- Career impacts, including stress and adaptation demands on professionals navigating automation and new roles.
- Operational challenges, such as mitigating hallucination and bias, which demand resource-intensive techniques like retrieval-based grounding and dataset curation (see the minimal retrieval sketch after this list).
- The financial scale of AI investment, with $1 trillion at risk, contrasted with the potential environmental gains if experimentation focuses on efficiency.
- The importance of evaluating these costs independently to understand their implications for AI/ML, data management, and career trajectories.
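For a rough sense of how such energy figures are estimated, the sketch below follows the accounting approach of the cited Energy and Policy Considerations paper: accelerator hours times power draw, scaled by datacenter overhead (PUE) and the grid's carbon intensity. All numbers are illustrative assumptions, not figures from the episode.

```python
# Back-of-envelope CO2 estimate for a training run.
# Every default below is an assumed placeholder, not data from the episode.

def training_co2_tonnes(gpu_hours: float,
                        gpu_power_kw: float = 0.4,          # assumed average draw per GPU (kW)
                        pue: float = 1.2,                    # assumed datacenter overhead factor
                        grid_kg_co2_per_kwh: float = 0.4) -> float:  # assumed grid carbon intensity
    """Estimate CO2 in metric tonnes for a training run under the stated assumptions."""
    energy_kwh = gpu_hours * gpu_power_kw * pue       # total electricity, including overhead
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0  # kg -> tonnes

# Hypothetical 1,000,000 GPU-hour run: about 192 tonnes of CO2 under these assumptions.
print(f"{training_co2_tonnes(1_000_000):.0f} tonnes CO2")
```

Swapping in different power, PUE, or grid-intensity values shows how strongly the final number depends on where and how the training is run.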
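As a minimal illustration of the retrieval idea mentioned above: before querying a model, look up the most relevant passage from a trusted corpus and include it in the prompt so answers stay grounded in source text. The tiny corpus, bag-of-words scoring, and prompt format here are assumptions for illustration only, not a system discussed in the episode.

```python
# Minimal retrieval-before-prompting sketch (illustrative assumptions throughout).
from collections import Counter
import math

CORPUS = {
    "energy": "Training large models consumes significant electricity and emits CO2.",
    "bias": "Curating and auditing training datasets helps reduce biased outputs.",
}

def _vec(text: str) -> Counter:
    # Bag-of-words term counts as a stand-in for real embeddings.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def build_grounded_prompt(question: str) -> str:
    # Pick the passage most similar to the question and prepend it as context.
    q = _vec(question)
    best = max(CORPUS.values(), key=lambda passage: _cosine(q, _vec(passage)))
    return f"Context: {best}\nQuestion: {question}\nAnswer using only the context."

print(build_grounded_prompt("How much electricity does training use?"))
```

The resource cost discussed in the episode comes from doing this at scale: maintaining the corpus, computing embeddings, and running the extra retrieval step on every query.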
Speaker - Priti - Founder, Human in Loop Podcasts
Check the references below. Dig deeper if you'd like! We all see things our own way, so feel free to explore.
References:
- LLM Reliability (Rush Shahani).
- Carbon emissions of ChatGPT usage: environmental impacts of ChatGPT in different regions (Scientific Reports, 2024).
- Explained: Generative AIโs environmental impact (MIT News, January 2025).
- 9 top AI and machine learning trends to watch in 2025 (TechTarget, January 2025).
- Energy and Policy Considerations for Deep Learning in NLP (University of Massachusetts, 2019) - Relevant for historical energy context.
- ChatGPT says our GPUs are melting as it puts limit on image generation requests (The Verge, March 2025) - Relevant for GPU strain from trends.