The Staggering Ecological Impacts of Computation and the Cloud

This article was written by Steven Gonzalez Monserrate for The MIT Press Reader.

Screens brighten with the flow of words. Perhaps they are emails, hastily scrawled on smart devices, or emoji-laden messages exchanged between friends or families. On this same river of the digital, millions flock to binge their favorite television programming, to stream pornography, to enter the sprawling worlds of massively multiplayer online roleplaying games, or simply to look up the meaning of an obscure word or the location of the nearest COVID-19 testing center.

Whatever your query, desire, or purpose, the internet provides, and all of the complexity of everything from unboxing videos to do-it-yourself blogs is contained within infinitely complex strings of bits. As they travel across time and space at the speed of light, beneath our oceans in fiber optic cables thinner than human hairs, these dense packets of information, instructions for pixels or characters or frames encoded in ones and zeros, unravel to create the digital veneer before you now. The words you are reading are a point of entry into an ethereal realm that many call the “Cloud.”

While in technical parlance the “Cloud” might refer to the pooling of computing resources over a network, in popular culture, “Cloud” has come to signify and encompass the full gamut of infrastructures that make online activity possible, everything from Instagram to Hulu to Google Drive. Like a puffy cumulus drifting across a clear blue sky, refusing to maintain a solid shape or form, the Cloud of the digital is elusive, its inner workings largely mysterious to the wider public, an example of what MIT cybernetician Norbert Wiener once called a “black box.” But just as the clouds above us, however formless or ethereal they may appear to be, are in fact made of matter, the Cloud of the digital is also relentlessly material.

To get at the matter of the Cloud we must unravel the coils of coaxial cables, fiber optic tubes, cellular towers, air conditioners, power distribution units, transformers, water pipes, computer servers, and more. We must attend to its material flows of electricity, water, air, heat, metals, minerals, and rare earth elements that undergird our digital lives. In this way, the Cloud is not only material, but is also an ecological force. As it continues to expand, its environmental impact increases, even as the engineers, technicians, and executives behind its infrastructures strive to balance profitability with sustainability. Nowhere is this dilemma more visible than in the walls of the infrastructures where the content of the Cloud lives: the factory-libraries where data is stored and computational power is pooled to keep our cloud applications afloat.

Cloud the Carbonivore

It is four in the morning when the incident occurs. At that moment, I am crouched on the floor of one of the containment aisles of the data center, computers arrayed like book stacks in a library on either side of me. The clamor of server fans makes it nearly impossible for me to hear Tom, the senior technician I am shadowing, explain to me how to pry open a faulty floor tile. With a specialized tool, I remove the white square tile from its hinges, noticing tiny perforations etched on its surface, points of ingress designed to help cool air rush up from a vast, pressurized cavity beneath us called a “plenum.” I set the tile aside, feeling a rush of cold tickle my nose as a gust of chill whips up from the exposed underfloor plenum. I go about replacing the tile, using one with more notches to improve airflow to this particular cluster of dense computing equipment. That is when I hear the alarms go off. Amid a sea of blinking green and blue lights, an entire rack of computers suddenly scintillates yellow, and then, after a few seconds, a foreboding red. In that instant, panic sweeps over Tom’s face, and he too is flush and crimson as he scrambles to contain the calamity unfolding around us.

“They’re overheating,” Tom says, upon inspecting the thermal sensors, sweat dripping from his brow.

I feel the heat swarming the air. The flood of warmth seeps into the servers faster than the heat sinks mounted on their circuit boards can dissipate it, faster than the fans can expel the hot air now recycling in a runaway feedback loop of warming. The automatic shutdown sequence begins, and Tom curses, reminding me that every minute of downtime, of service interruption, may cost the company many thousands of dollars. Within two minutes, however, the three massive air conditioning units that had been idling in a standby state activate to full power, flooding the room with an arctic chill and restoring order to the chaotic scene.

In the vignette above, which draws on my ethnographic fieldnotes, I recount an episode that data center technicians refer to as a “thermal runaway event,” a cascading failure of cooling systems that interrupts the functioning of the servers that process, store, and facilitate everything online. The molecular frictions of digital industry, as this example shows, proliferate as unruly heat. The flotsam and jetsam of our digital queries and transactions, the flurry of electrons flitting about, warm the medium of air. Heat is the waste product of computation, and if left unchecked, it becomes a foil to the workings of digital civilization. Heat must therefore be relentlessly abated to keep the engine of the digital thrumming in a constant state, 24 hours a day, every day.

To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air, so that it can displace or lift perilous heat away from computers. Today, power-hungry computer room air conditioners (CRACs) or computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from “dirty” electricity grids, especially in Virginia’s “data center alley,” the site of 70 percent of the world’s internet traffic in 2019. To cool, the Cloud burns carbon, what Jeffrey Moro calls an “elemental irony.” In most data centers today, cooling accounts for greater than 40 percent of electricity usage.
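To give that cooling figure a rough numerical shape, here is a minimal back-of-the-envelope sketch. The 5-megawatt facility size and the grid emissions factor are illustrative assumptions of mine, not measurements from any facility described here; only the 40 percent cooling share comes from the paragraph above.

```python
# Back-of-the-envelope: how much of a data center's annual electricity
# (and resulting carbon) goes to cooling, if cooling is ~40 percent of usage.
# The facility size and grid emissions factor are assumed for illustration.

FACILITY_POWER_MW = 5.0        # assumed average facility draw, in megawatts
COOLING_SHARE = 0.40           # cooling share cited in the text ("more than 40 percent")
GRID_KG_CO2_PER_KWH = 0.4      # assumed "dirty grid" emissions factor
HOURS_PER_YEAR = 24 * 365

total_mwh = FACILITY_POWER_MW * HOURS_PER_YEAR       # annual facility energy (MWh)
cooling_mwh = total_mwh * COOLING_SHARE              # portion spent on cooling alone
cooling_tonnes_co2 = cooling_mwh * 1000 * GRID_KG_CO2_PER_KWH / 1000

print(f"Annual facility energy: {total_mwh:,.0f} MWh")
print(f"Cooling alone:          {cooling_mwh:,.0f} MWh")
print(f"CO2 from cooling alone: {cooling_tonnes_co2:,.0f} tonnes")
```

On those assumptions, a single mid-sized facility emits thousands of tonnes of CO2 each year just to move heat away from its servers, before any computation is counted.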

While some of the most advanced, “hyperscale” data centers, like those maintained by Google, Facebook, and Amazon, have pledged to transition their sites to carbon neutrality via carbon offsetting and investment in renewable energy infrastructures like wind and solar, many of the smaller-scale data centers that I observed lack the resources and capital to pursue similar sustainability initiatives. Smaller-scale, traditional data centers have often been set up within older buildings that are not optimized for ever-changing power, cooling, and data storage capacity needs. Since the emergence of hyperscale facilities, many companies, universities, and others who operate their own small-scale data centers have begun to transfer their data to hyperscalers or cloud colocation facilities, citing energy cost reductions.

According to a Lawrence Berkeley National Laboratory report, if the entire Cloud shifted to hyperscale facilities, energy usage might drop by as much as 25 percent. With no regulatory body or agency to incentivize or enforce such a shift in our infrastructural configuration, other solutions have been proposed to curb the Cloud’s carbon problem. Some have proposed relocating data centers to Nordic countries like Iceland or Sweden, in a bid to use ambient cool air to minimize their carbon footprint, a technique called “free cooling.” However, network signal latency makes this dream of a haven for green data centers largely untenable for meeting the computing and data storage demands of the wider world.
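The latency objection is, at bottom, a matter of physics: light in optical fiber travels at roughly two-thirds of its speed in a vacuum, so every additional thousand kilometers of cable adds several milliseconds each way. The sketch below runs that arithmetic; the cable distance between Iceland and the U.S. East Coast is an assumed round number rather than a measured route.

```python
# Rough propagation delay between a hypothetical Icelandic data center
# and users on the U.S. East Coast, ignoring routing, switching, and
# server processing time. The cable length is an assumed figure.

SPEED_OF_LIGHT_KM_PER_S = 299_792     # speed of light in vacuum
FIBER_REFRACTIVE_INDEX = 1.47         # typical for silica optical fiber
ASSUMED_CABLE_KM = 5_000              # assumed fiber path, Iceland to New York

speed_in_fiber = SPEED_OF_LIGHT_KM_PER_S / FIBER_REFRACTIVE_INDEX  # ~204,000 km/s
one_way_ms = ASSUMED_CABLE_KM / speed_in_fiber * 1000
round_trip_ms = 2 * one_way_ms

print(f"One-way delay:  {one_way_ms:.0f} ms")
print(f"Round trip:     {round_trip_ms:.0f} ms")
```

A few dozen milliseconds of round-trip delay, before any switching or server time is added, is tolerable for email but noticeable in gaming, video calls, and other latency-sensitive services, which is part of why so much computing stays close to its users.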

All told, the Cloud now has a greater carbon footprint than the airline industry. A single data center can consume the equivalent of the electricity used by 50,000 homes. At 200 terawatt hours (TWh) annually, data centers collectively devour more energy than some nation-states. Today, the electricity utilized by data centers accounts for 0.3 percent of overall carbon emissions, and if we extend our accounting to include networked devices like laptops, smartphones, and tablets, the total shifts to 2 percent of global carbon emissions.
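The scale claims above can be sanity-checked with simple arithmetic. In the sketch below, the 200 TWh figure comes from the paragraph itself; the per-household consumption is my own assumed ballpark (roughly the commonly cited U.S. average), not a sourced statistic.

```python
# Sanity check of the scale claims: 200 TWh of annual data center
# electricity, expressed as household equivalents. The per-home figure
# is an assumed ballpark, not a sourced statistic.

DATA_CENTER_TWH = 200                # annual data center electricity (from the text)
HOME_KWH_PER_YEAR = 10_500           # assumed annual usage of one U.S. household

data_center_kwh = DATA_CENTER_TWH * 1_000_000_000    # 1 TWh = 1e9 kWh
home_equivalents = data_center_kwh / HOME_KWH_PER_YEAR

print(f"Household equivalents: {home_equivalents:,.0f}")   # roughly 19 million homes

# On the same assumption, one facility matching 50,000 homes draws about
# 50_000 * 10_500 kWh, i.e. roughly half a terawatt-hour per year.
```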

Why so much energy? Beyond cooling, the energy requirements of data centers are vast. To meet the pledge to customers that their data and cloud services will be available anytime, anywhere, data centers are designed to be hyper-redundant: If one system fails, another is ready to take its place at a moment’s notice, to prevent any disruption to the user experience. Like Tom’s air conditioners idling in a low-power state, ready to rev up when things get too hot, the data center is a Russian doll of redundancies: redundant power systems like diesel generators, redundant servers ready to take over computational processes should others become unexpectedly unavailable, and so forth. In some cases, only 6 to 12 percent of energy consumed is devoted to active computational processes. The remainder is allocated to cooling and maintaining chains upon chains of redundant fail-safes to prevent costly downtime.
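One way to feel the weight of that redundancy is to invert the 6-to-12 percent figure: for every kilowatt-hour spent on active computation, several more go to cooling, conversion losses, and idling backups. A minimal sketch of that inversion, using only the range quoted above:

```python
# Overhead implied by the claim that only 6-12 percent of consumed energy
# goes to active computation, the rest supporting cooling and layers of
# redundant fail-safes.

for compute_share in (0.06, 0.12):
    overhead_per_useful_kwh = (1 - compute_share) / compute_share
    print(f"If {compute_share:.0%} of energy is active compute, every useful kWh "
          f"carries about {overhead_per_useful_kwh:.1f} kWh of overhead")
```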


