Cities around the world are racing to build data centers, the vast buildings full of computers that power artificial intelligence, but this creates a big problem: these facilities consume enormous amounts of electricity. This weekly update covers how cities are learning to build AI infrastructure without overloading their electrical grids.

In Southern California, a small industrial town called Vernon, with just over 200 residents, has become a hub for AI data centers. One building there uses as much electricity as 26,400 homes, and companies including Google, Amazon, Microsoft, and Meta are investing hundreds of billions of dollars to build even more data centers across the United States. Vernon is attractive because it gets cheap electricity from its own municipal power company and has few neighbors to complain about noise or other side effects.

Communities elsewhere, however, are pushing back against data center projects. A proposed data center in Westfield, Massachusetts would have used four to five times as much power as the entire city consumes, and the developers appear to have abandoned the project. Many towns worry that data centers will strain local water and electricity supplies, driving up utility bills for residents.

City leaders are now focusing on smart design to cut wasted power. Instead of sending every sensor reading to giant cloud data centers far away, engineers are placing smaller AI computers close to the sensors themselves, processing data locally and transmitting only the results, which saves energy. Cities are also choosing smaller, more efficient chips sized for the task instead of the biggest, most powerful ones. And some agencies are building local simulation labs where they can test AI systems before deploying them in the real world.
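To make the edge-processing idea concrete, here is a minimal sketch of the pattern. It is illustrative only: the sensor feed is simulated, and the names (EdgeNode, ingest, batch_size) are hypothetical rather than drawn from any city's actual deployment. The point is that the device filters and summarizes data locally, so one compact record travels to the cloud instead of a raw stream.

```python
import random
import statistics

# Hypothetical edge node: buffers raw sensor readings locally and
# forwards only a compact summary to the cloud, instead of streaming
# every reading to a distant data center.
class EdgeNode:
    def __init__(self, batch_size: int = 60):
        self.batch_size = batch_size  # readings per summary window
        self.buffer: list[float] = []

    def ingest(self, reading: float):
        """Buffer a reading; return a summary when the window is full."""
        self.buffer.append(reading)
        if len(self.buffer) < self.batch_size:
            return None  # nothing leaves the node yet
        summary = {
            "mean": round(statistics.mean(self.buffer), 2),
            "max": max(self.buffer),
            "count": len(self.buffer),
        }
        self.buffer.clear()
        return summary  # one small record instead of 60 raw readings


# Simulated traffic-sensor feed (purely illustrative values).
node = EdgeNode(batch_size=60)
uploads = 0
for _ in range(600):  # ten summary windows of raw readings
    summary = node.ingest(random.uniform(0.0, 100.0))
    if summary is not None:
        uploads += 1  # in practice this would be a network send
print(f"600 raw readings reduced to {uploads} cloud uploads")
```

The same idea scales up: the less raw data that has to cross the network to a distant data center, the less energy each round trip consumes.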

The big lesson is that good planning matters more than fancy technology. Cities that define the problem they want to solve before buying expensive AI equipment make much better decisions about their infrastructure.

Extended Coverage