Stuff You Should Know

Data Centers: Can't Live With Em, Can't Live Without Em

January 15, 2026

Key Takeaways

  • Global data consumption is growing exponentially, reaching 150 zettabytes in 2024, necessitating a massive expansion of data centers. 
  • The evolution of data centers progressed from early military computers like Colossus and ENIAC to mainframes, personal computers, and finally to cloud computing, which paved the way for modern hyperscale facilities. 
  • The current AI boom is driving unprecedented demand for specialized hardware, particularly GPUs, leading to massive corporate investment in new data centers while simultaneously creating significant environmental concerns regarding electricity and water usage. 

Segments

Podcast Introduction and Context
(00:00:45)
  • Key Takeaway: The hosts of Stuff You Should Know introduce the episode topic: data centers, acknowledging their foundational role in the digital world and the current environmental concerns surrounding their expansion.
  • Summary: The episode of Stuff You Should Know focuses on data centers, which underpin all websites and social media activity. The hosts acknowledge the current boom in these facilities, driven by AI, and hint at the environmental toll they impose. The introduction also features light banter regarding podcast editing habits.
Data Consumption Scale
(00:03:40)
  • Key Takeaway: Global data consumption reached 150 zettabytes in 2024, a massive increase from just two zettabytes consumed in 2010, driving exponential growth in data center needs.
  • Summary: Data consumption includes all digital activities, from uploading videos to making credit card purchases. A zettabyte is defined as one trillion gigabytes. This exponential growth in data usage is forcing data centers into hyperdrive expansion.
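The scale described above can be sanity-checked with some quick arithmetic. This sketch uses only the figures from the summary (2 zettabytes in 2010, 150 in 2024); the implied-growth-rate calculation is our own illustration, not a figure from the episode.

```python
# Sanity-checking the episode's data-consumption figures.
GIGABYTES_PER_ZETTABYTE = 10**12  # 1 ZB = 10^21 bytes = one trillion GB

consumption_2010_zb = 2    # zettabytes consumed in 2010 (per the episode)
consumption_2024_zb = 150  # zettabytes consumed in 2024 (per the episode)

# 150 ZB expressed in gigabytes
total_gb = consumption_2024_zb * GIGABYTES_PER_ZETTABYTE

# Implied compound annual growth rate over the 14-year span
years = 2024 - 2010
cagr = (consumption_2024_zb / consumption_2010_zb) ** (1 / years) - 1

print(f"150 ZB = {total_gb:,} GB")
print(f"Implied average annual growth: {cagr:.0%}")
```

Growing 75-fold in 14 years works out to roughly 36% compound growth per year, which is what "exponential" means in practice here: consumption more than doubles about every two and a half years.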
Early Computing History
(00:06:21)
  • Key Takeaway: Colossus, the first programmable electronic digital computer, used during WWII at Bletchley Park, represents an early form of data processing facility.
  • Summary: Early data centers were essentially large electronic computers built with vacuum tubes and manual switches, often serving military purposes. The ENIAC was the first general-purpose electronic computer, focused on data processing rather than long-term storage. The Colossus machine is now housed at the National Museum of Computing in the UK.
Mainframes and Early Business Use
(00:09:19)
  • Key Takeaway: Mainframes, originating from cabinets housing telecommunication equipment in the 1950s, became centralized data processing hubs for companies, requiring physical presence for access.
  • Summary: The first commercial mainframe user was the UK teashop chain J. Lyons & Co., whose LEO computer began running business applications such as bakery valuations in 1951. IBM became a leader in the 1950s, leasing units for significant monthly costs, such as $16,000 in 1952 money. Mainframes remain in use today by entities like Visa and healthcare companies due to their high reliability and security.
Internet and Cloud Computing Impact
(00:15:46)
  • Key Takeaway: The rise of the internet and subsequent cloud computing in the early 2000s democratized data center access, allowing smaller businesses and individuals to store data remotely.
  • Summary: The internet spurred the growth of data centers to support e-commerce, leading to the dot-com bust and subsequent stabilization. Cloud computing was a major game-changer, allowing data to be stored off-site by providers like Amazon Web Services. This enabled business models like Dropbox, where consumers pay a flat fee for remote storage capacity.
AI Data Center Requirements
(00:23:01)
  • Key Takeaway: AI data centers require exponentially more computing power than traditional infrastructure, relying on thousands of linked Graphics Processing Units (GPUs) like the NVIDIA H100 for parallel processing.
  • Summary: The release of ChatGPT in 2022 accelerated the need for massive, fast-built data centers. Training a model like ChatGPT used around 20,000 GPUs. NVIDIA’s stock surged due to the demand for these specialized chips, which are essential for the parallel processing capabilities AI requires.
Investment and Bubble Concerns
(00:27:20)
  • Key Takeaway: Massive corporate investment, projected at $3 trillion globally by 2030 for data centers, is occurring despite only 5% of current AI programs securing returns on investment, raising concerns about an AI bubble.
  • Summary: Microsoft, Amazon, Google, and Meta are pledging hundreds of billions toward data center expansion over the next decade. This investment is partly fueled by private credit, which is largely unregulated and mirrors the risky debt speculation seen before the 2008 meltdown. Warnings about a potential AI bubble are being issued by bodies including the IMF.
Environmental and Local Impact
(00:36:00)
  • Key Takeaway: Hyperscale data centers consume energy equivalent to a town of 50,000 people and use vast amounts of water for evaporative cooling, often leading to local resource scarcity and increased electricity prices for residents.
  • Summary: Data centers currently use 1 to 1.5% of the world’s electricity, with projections suggesting US data center energy use could hit 13% of national demand by 2030. These facilities generate significant heat, necessitating water-intensive evaporative cooling systems. In areas like Northern Virginia’s Data Center Alley, electricity prices have risen 267% since 2020 due to this massive demand.
Jobs and Government Subsidies
(00:44:05)
  • Key Takeaway: Despite massive investment, large data centers create relatively few local jobs (e.g., 400 jobs for a £10 billion UK center), while local governments heavily subsidize these operations, often at the expense of local resources.
  • Summary: The economic benefits for local communities from data center construction are often minimal in terms of employment due to high automation. Profits from international data centers frequently flow back to the parent company’s home country. Governments are criticized for giving these corporations whatever they want without imposing necessary checks or balances.
Listener Mail Segment
(00:46:56)
  • Key Takeaway: The BBC has maintained a long-standing, beneficial partnership with the Open University (OU) since the 1970s to facilitate accessible, university-level education via broadcast media.
  • Summary: The BBC and Open University partnership co-produces high-quality, informed content across TV, radio, and online platforms, including nature programming. The OU creates supplementary materials to extend the learning journey for viewers and listeners. This collaboration enables the public to access specialist knowledge in accessible formats.