Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence

By Kate Crawford

Atlas of AI explores the hidden material, environmental, and social costs of artificial intelligence, revealing how this seemingly abstract technology relies on intense resource extraction and systemic labor exploitation.

Table of Contents

In our modern world, artificial intelligence is often spoken of as something magical—a form of disembodied, digital consciousness that lives in the ‘cloud’ and solves complex problems with superhuman speed. We see it in our smartphones, our social media feeds, and the growing list of automated services we use every day. But if we want to truly understand what AI is, we have to look past the sleek interfaces and the marketing buzzwords. We need to look at the physical world.

Think of this journey as an exploration of a vast, global map—an atlas that charts the territory of technology not through lines of code, but through the physical resources, human energy, and political power that make it possible. To build a single AI model, the industry requires massive amounts of minerals dug from the earth, vast quantities of water and electricity to cool and power servers, and the labor of millions of people who remain largely invisible.

This summary will take you from the desolate salt flats of the desert to the high-pressure environments of data labeling factories. We will see how AI is less an autonomous ‘intelligence’ and more a massive industrial system of extraction. By tracing the throughline from 19th-century pseudo-science to modern machine learning, we will uncover the true cost of our digital age and ask ourselves what kind of future we are actually building. It’s time to pull back the curtain on the artificial and look at the very real systems of power at play.

Discover why the term ‘intelligence’ might be a misnomer for modern software and how a famous 19th-century horse reveals our tendency to over-attribute human traits to machines.

Explore the physical reality of digital technology, from the lithium mines in the Nevada desert to the environmental toll of maintaining massive server farms.

Uncover the ‘ghost work’ that powers modern technology, where millions of low-paid workers perform the repetitive tasks necessary to make machines seem smart.

Learn how the way AI categorizes people can reinforce old prejudices and how modern data sets are sometimes based on outdated and harmful scientific theories.

As we have seen, the map of artificial intelligence is much larger and more complex than the industry’s marketing would suggest. It is an atlas that spans from the depths of the earth to the intricacies of human social life. We have uncovered how the illusion of intelligence masks a system of data and pattern matching, how the digital cloud rests on a foundation of environmental destruction, and how the invisible labor of millions is used to power ‘automated’ services. Most importantly, we have explored how the way AI classifies us can reinforce systemic inequalities and take us back to outdated ways of thinking about human identity.

What this really means is that AI is not an inevitable force of nature or a neutral tool. It is a system of power, built by specific people and corporations for specific ends. The throughline here is clear: the hidden costs of AI—the environmental, social, and political tolls—are currently being borne by the many for the benefit of the few.

Moving forward, we must stop viewing AI as a purely technical challenge and start seeing it as a social and ethical one. Actionable change starts with demanding transparency about where the materials for our devices come from, how our data is being used, and who is being harmed by the algorithms used in policing, hiring, and healthcare. We must shift the focus from ‘innovation at any cost’ to a model that prioritizes sustainability, labor rights, and human dignity. By understanding the true atlas of AI, we gain the knowledge needed to demand a technology that serves everyone, not just the systems of power that created it.

About this book

What is this book about?

When we think of artificial intelligence, we often imagine ethereal code floating in a digital cloud. However, this book reveals the heavy physical reality behind the algorithms. It takes listeners on a journey from the lithium mines of Nevada to the data centers consuming massive amounts of energy, illustrating that AI is not a disembodied force but a deeply material industry. The book promises to strip away the marketing hype surrounding 'intelligent' machines to show how these systems are actually built on the backs of low-paid workers and stolen data. By examining the history of classification and the current race for global dominance, it provides a critical framework for understanding the profound influence AI has on our environment, our privacy, and our social structures.

Book Information

About the Author

Kate Crawford

Kate Crawford is an author and scholar who studies the social implications of AI. She has held research positions at the USC Annenberg School, Microsoft Research, and the École Normale Supérieure.

Ratings & Reviews

Ratings at a glance

3.4

Overall score based on 145 ratings.

What people think

Listeners describe the work as extensively researched and packed with detailed information about artificial intelligence. They also value its thoroughness; one listener noted that the author handles intricate concepts with excellent clarity. Opinions on readability, however, are divided: some consider it exceptional, while others find it a bit of a slog. The author's distinctive voice and prose style likewise draw mixed responses.

Top reviews

Kom

This book is a necessary reality check for anyone seduced by the magical thinking surrounding algorithms. Crawford does a brilliant job of pulling back the curtain on the physical reality of the 'cloud,' showing us that it isn't some ethereal space but a massive consumer of energy and minerals. I was particularly struck by the detailed mapping of the lithium mines in Nevada. It’s haunting to think that my smart devices are tied to that level of environmental degradation. While the writing is dense, the way she connects 19th-century mining practices to modern server farms is incredibly clear. You won't look at your smartphone the same way after reading this. It’s an essential, well-researched critique of how power concentrates in the hands of the tech elite.

Pairot

Crawford masterfully deconstructs the digital 'cloud' as a physical, resource-hungry entity rather than an abstract mathematical concept. Most people think of AI as code, but this book proves it is also lithium, water, and exploited human labor. The narrative is chilling, especially the sections on how facial recognition is used by the state to monitor marginalized populations. I loved the emphasis on how classification systems aren't neutral but are actually encoded with the biases of their creators. The sentence structure is varied enough to keep the academic material engaging without sacrificing the seriousness of the subject matter. It's a grim read, but it's probably the most important tech book released in the last decade. Absolutely mandatory reading for the 'tech dudebros' who ignore these costs.

Yanin

Finally got around to reading this, and it’s essentially a manifesto against techno-optimism that we desperately needed. Crawford exposes the hidden costs of our convenience, from the radioactive waste of mineral extraction to the 'ghost work' of content moderators. The way she traces the history of eugenics into modern data classification is both brilliant and deeply disturbing. Not gonna lie, it’s a bit of a depressing read, but the truth usually is. The book is well-researched and filled with shocking statistics that stay with you long after you finish. It’s the perfect antidote to the marketing fluff coming out of Silicon Valley. If you care about the future of the planet and human rights, buy this book immediately.

Bella

Ever wonder why we are still asked to identify traffic lights or crosswalks to prove we are human? Crawford’s chapter on labor explains this perfectly by highlighting the 'Potemkin AI' phenomenon, where low-paid workers are actually the ones doing the heavy lifting for systems that claim to be fully automated. The book is filled with these kinds of eye-opening insights. I found the discussion on the 'Clever Hans' effect particularly relevant to today's LLM hype. My only real gripe is that the tone can get a bit pedantic, and the author doesn't offer much in the way of solutions or positive alternatives. Still, for a deep dive into the politics of data, this is hard to beat. It’s a sobering look at who really wins in the AI race.

Teng

Picked this up after hearing about the 'Clever Hans' analogy in a podcast, and I’m glad I did. The book moves from the earth's crust to the halls of government power, showing how AI is less about intelligence and more about a new form of planetary extraction. The chapter on Affect was a standout for me, debunking the pseudoscience of reading emotions from facial expressions. It’s terrifying that companies are actually using this stuff for hiring decisions. While I agree with other reviewers that the book can be a bit repetitive, the sheer volume of detailed evidence Crawford provides is impressive. She doesn't just make claims; she backs them up with extensive field research. It’s a dense read but ultimately worth the effort for the perspective shift.

Wachira

The chapter on classification was particularly chilling for me as someone who works in data science. It’s uncomfortable to realize how the categories we build—gender, race, even 'normalcy'—are often just reflections of historical prejudices. Crawford doesn't pull any punches when describing how these systems are used by the state for surveillance. I appreciated how she linked the Snowden leaks to the current business models of companies like Palantir and Amazon. The writing is sophisticated, and while it gets a bit academic at times, the core message is always clear. It’s an unsettling dive into the power structures that dictate our digital lives. I would have liked more discussion on how to build better systems, but as a critique, it's very powerful.

Samart

As someone who appreciates the 'atlas' metaphor, I found the geographical approach to AI's impact really unique and effective. Crawford doesn't just talk about code; she takes us to the sites where AI is physically made and used. The complexity of the ideas is presented without sacrificing clarity, which is a rare feat in academic writing. To be fair, the book does get a bit winding in the middle, and I found the chapter on the state to be a bit of a retread of other surveillance literature. However, the overarching argument that AI is a register of power is undeniable. It’s a thought-provoking, clear-eyed analysis of the negative aspects lurking under the surface of our high-tech world. A solid four-star read for anyone interested in STS.

Vilaiporn

I wanted to learn about the actual mechanics of machine learning, but instead, I got a sprawling history lesson on mineral mines and 19th-century labor politics. To be fair, Crawford’s research is exhaustive and the information is objectively important. However, the definition of AI used here is so broad that it eventually loses its meaning. Sometimes she’s talking about software, and other times she’s talking about the entire global capitalist infrastructure. This makes the core argument feel a bit diluted at times. I appreciated the sections on how data scraping dehumanizes individuals, but the constant pivoting between ethnographic field trips and Wikipedia-style historical tangents made for a disjointed reading experience. It’s a good book, just maybe not the one I was expecting given the title.

Hazel

The research here is undeniably deep, yet the prose often feels like wading through thick mud. I found myself checking the bibliography more than the actual text because the citations are almost more interesting than the narrative synthesis. To be honest, the book repeats its central thesis—that AI is an extractive industry—so many times that it becomes tedious by the third chapter. There is a lot of valuable information regarding the Enron corpus and the ethics of mugshot databases, which are points everyone in tech should consider. But the lack of a clear, consistent definition of what constitutes AI makes it easy for the author to blame 'AI' for every societal ill since the industrial revolution. It's a mixed bag for me.

Lucia

Frankly, this was a massive disappointment given the hype surrounding it as a foundational text in algorithmic ethics. The book reads more like a collection of disconnected anecdotes than a coherent argument about artificial intelligence. My biggest issue is that Crawford spends more time on 18th-century philosophy and mineral mining than on how modern models actually function. If you swap 'AI' with 'Industrialization' or 'Global Trade,' most of her arguments remain exactly the same. It feels like she’s using the buzzword of the moment to repackage standard Marxist critiques. The lack of nuance regarding the potential benefits of the technology makes the whole thing feel like a one-sided polemic. If you're looking for a balanced view, you won't find it here.

