Reblog > The cloud

Note: after a Summer “pause” mostly dedicated to a mid-term exhibition and publication of our joint design research, we are preparing to dive back into the I&IC project for 18 months. This will first happen next November through a couple of new workshops with guest designers/partners (Dev Joshi from Random International at ECAL and Sascha Pohflepp at HEAD). We will take the occasion to further test different approaches to and ideas about “The cloud”. We will then move on to the following steps of our work, focused on the development of a few “counter-propositional” artifacts.

But before diving back in, I take the occasion to reblog an article by James Bridle published earlier this year, which could act as an excellent reminder of our initial questions, as well as a good way to relaunch our research. Interestingly, Bridle focuses on the infrastructural aspects of the cloud (mostly pointing out its “hard” parts), which may in fact become the main focus of our research as well in this second phase. Scaled down, certainly…

 

Via Icon (thanks Lucien for the reference)

—–


 

There’s something comforting about the wispy metaphor for the network that underpins most aspects of our daily lives. It’s easy to forget the reality of its vast physical infrastructure

Heating homes with Clouds – links


Using excess heat generated by data centers to warm homes isn’t a new idea. Earlier in our research we stumbled upon Qarnot, a French company proposing to decentralize the data center into meshed radiators that distribute computing resources across people’s homes (we’re guessing they took their name from Carnot’s limit ;)). In 2013 they announced a partnership with the city of Paris to heat 350 low-income housing units.
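As a rough sense of scale: a server turns essentially all of the electricity it draws into heat, so a “computing radiator” warms a room about as well as an electric heater of the same wattage. A minimal back-of-the-envelope sketch in Python; the 500 W per unit and 100 W/m² of heating demand below are illustrative assumptions, not Qarnot’s actual specifications:

```python
import math

# Illustrative assumptions (not Qarnot's specs): one radiator-server
# dissipating ~500 W, and a rough heating demand of ~100 W per m^2
# (this varies a lot with insulation and climate).
RADIATOR_POWER_W = 500
HEAT_DEMAND_W_PER_M2 = 100

def radiators_needed(room_area_m2: float) -> int:
    """How many radiator-servers would cover a room's heating demand."""
    demand_w = room_area_m2 * HEAT_DEMAND_W_PER_M2
    return math.ceil(demand_w / RADIATOR_POWER_W)

for area in (10, 20, 40):
    print(f"{area} m2 -> {radiators_needed(area)} radiator-server(s)")
```

A 20 m² room would thus need on the order of four such units, which suggests why the scheme targets whole buildings rather than single machines.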

However, they are not the only ones in the race…

About hot and cold air flows (in data centers)

[Image: hot-aisle / cold-aisle layout diagram]

[Image: hot-aisle / cold-aisle temperature and humidity conditions]

Both images taken from the website Green Data Center Design and Management / “Data Center Design Consideration: Cooling” (03.2015). Source: http://macc.umich.edu.

ASHRAE is a “global society advancing human well-being through sustainable technology for the built environment”.

 

A typical question that arises with data centers is the need to cool down the overheating servers they contain. The more they compute, the more heat they produce and the more energy they consume, and therefore the more they need to be cooled down to stay in operation (a wide operating range would be between 10°C and 30°C). While the optimal server-room temperature seems to be around 20–21°C, it is closer to ~27°C for recent, professional machines (Google recommends 26.7°C).

The exact operating temperature is subject to discussion and depends on the hardware.
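For reference, the widely cited ASHRAE “recommended” envelope for air-cooled IT equipment puts server inlet air between 18°C and 27°C, which is consistent with the figures above. A minimal sketch of how such a check might look in monitoring code (the thresholds are the point here, not any particular API):

```python
# Server inlet temperature check against the ASHRAE "recommended"
# envelope (18-27 C). Thresholds are parameters, since the right
# operating range ultimately depends on the hardware.
ASHRAE_RECOMMENDED_C = (18.0, 27.0)

def inlet_status(temp_c: float, envelope=ASHRAE_RECOMMENDED_C) -> str:
    """Classify one inlet temperature reading."""
    low, high = envelope
    if temp_c < low:
        return "below envelope: likely overcooling, wasted energy"
    if temp_c > high:
        return "above envelope: thermal risk"
    return "within recommended envelope"

print(inlet_status(21.0))  # within recommended envelope
print(inlet_status(29.5))  # above envelope: thermal risk
```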

Yet in every data center the question of air conditioning and air flow arises. It always revolves around the organization in the upper drawing (or variations of it): 1) cold aisles, floors or areas must be created or maintained, from which the servers draw their cooling fluid, and 2) hot aisles, ceilings or areas must be managed, into which the heated air is released and from which it is extracted.

The second drawing shows that humidity matters as well, in relation to temperature.
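The aisle arrangement is ultimately about moving enough air through the racks to carry the heat away. The governing relation is the sensible-heat equation P = ρ · cp · Q · ΔT: the bigger the allowed temperature rise between cold and hot aisle, the less air has to move. A quick sketch of that arithmetic, with standard air properties as assumptions:

```python
# Airflow needed to remove a given IT load at a given air temperature
# rise, from the sensible-heat equation P = rho * cp * Q * dT.
# Air properties assumed at roughly room conditions.
AIR_DENSITY = 1.2           # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K)

def required_airflow_m3h(it_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/h) that carries away it_load_w at a delta_t_k rise."""
    flow_m3s = it_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)
    return flow_m3s * 3600

# Example: a 10 kW rack, cold aisle at 20 C, hot aisle at 30 C (dT = 10 K)
print(f"{required_airflow_m3h(10_000, 10):.0f} m^3/h")  # ~2985 m^3/h
```

Roughly 3,000 m³/h for a single 10 kW rack, which is why the cold and hot streams have to be kept strictly separated: any mixing forces even more air through the system.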

 

As hot air, expanded and lighter, naturally rises while cold air sinks, many interesting and possibly natural air streams could be imagined around this configuration…
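One way to gauge how far such passive streams could go is the “stack effect”: the buoyancy pressure across a column of hot air is roughly Δp ≈ ρ · g · h · ΔT / T. A quick estimate, with assumed values:

```python
# Buoyancy ("stack effect") pressure across a column of hot air,
# dp ~= rho * g * h * dT / T. Values below are assumptions for a
# ~3 m tall hot aisle with a 10 K hot/cold difference.
AIR_DENSITY = 1.2  # kg/m^3
G = 9.81           # m/s^2

def stack_pressure_pa(height_m: float, delta_t_k: float, ambient_k: float = 293.0) -> float:
    """Approximate pressure difference driving natural convection."""
    return AIR_DENSITY * G * height_m * delta_t_k / ambient_k

print(f"{stack_pressure_pa(3.0, 10.0):.1f} Pa")  # ~1.2 Pa
```

About one pascal of driving pressure for a room-height column: tiny compared with what server fans produce, which hints at why free-cooling designs rely on tall stacks, large openings and low flow resistance rather than brute force.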

 

Reblog > Power, Pollution and the Internet

Via The New York Times (via Computed·By)

—–


SANTA CLARA, Calif. — Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

 

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

Reblog > Setting up a Raspberry Pi to run bots

Artist Jeff Thompson has put together this comprehensive tutorial on how to run bots on a Raspberry Pi microcomputer – including the basics of setting up the Pi to run without a screen and programming it remotely by SSH-ing into it from another computer. This is an interesting way to tap into small resources of the cloud without necessarily consuming vast quantities of energy.
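The tutorial’s actual steps (flashing the SD card, enabling SSH, installing dependencies) are best followed at the source; below is only a minimal sketch of the kind of headless bot loop such a setup typically ends up running. The post_update() function is a hypothetical placeholder, not Thompson’s code; a real bot would call some service’s API there:

```python
#!/usr/bin/env python3
# Minimal headless bot loop (sketch). post_update() is a hypothetical
# placeholder; a real bot would call a service's API here.
import logging
import random
import time

logging.basicConfig(filename="bot.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

PHRASES = [
    "the cloud is just other people's computers",
    "zombie servers dream of unused data",
]

def post_update(text: str) -> None:
    """Stand-in for an API call to a microblogging service."""
    logging.info("posting: %s", text)

if __name__ == "__main__":
    while True:
        post_update(random.choice(PHRASES))
        time.sleep(3600)  # one post per hour keeps CPU and energy use tiny
```

Started over SSH with something like `nohup python3 bot.py &`, it keeps running on the Pi after the session closes.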

—–

Via Algopop

 

Reblog > Into the Cloud (with zombies)


Note: Published almost two years ago, this interesting post by Kazys Varnelis remains a timely reminder of the energy and pollution hidden behind the cloud’s imagery.

—–

Via varnelis.net

Today’s New York Times carries a front-page piece by James Glanz on the massive energy waste and pollution produced by data centers. The lovely cloud that we’ve all been seeing icons for lately turns out to be made not of data, but of smog.

The basics here aren’t very new. Already six years ago, we heard the apocryphal story of a Second Life avatar consuming as much energy as the average Brazilian. That data centers consume huge amounts of energy and contribute to pollution is well known.
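The arithmetic behind that story went roughly like this: take the servers’ total power draw, add a cooling overhead, spread it over the concurrent avatars and annualize. Every figure below is an illustrative assumption in the spirit of that back-of-the-envelope calculation, not measured data:

```python
# Back-of-the-envelope: annual electricity attributable to one avatar.
# All inputs are illustrative assumptions, not Linden Lab figures.
SERVERS = 4_000              # assumed server count
WATTS_PER_SERVER = 250       # assumed draw per server
COOLING_OVERHEAD = 2.0       # assume cooling/infrastructure doubles the load
CONCURRENT_AVATARS = 12_500  # assumed average concurrent users
HOURS_PER_YEAR = 24 * 365

total_kw = SERVERS * WATTS_PER_SERVER * COOLING_OVERHEAD / 1000
kwh_per_avatar = total_kw * HOURS_PER_YEAR / CONCURRENT_AVATARS
print(f"~{kwh_per_avatar:,.0f} kWh per avatar per year")  # ~1,402 kWh
```

On the order of a thousand kilowatt-hours a year per avatar, which is indeed the ballpark of per-capita electricity consumption in some countries.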

On the other hand, Glanz does make a few critical observations. First, much of this energy use and pollution comes from our need to have data instantly accessible. Underscoring this, the article ends with the following quote:

“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”

Second, much of this data is rarely, if ever, used, residing on unused, “zombie” servers. Back to our Second Life avatars: like many of my readers, I created a few avatars half a decade ago and haven’t been back since. Do these avatars continue consuming energy, making Second Life an Internet version of the Zombie Apocalypse?

 

 

So the ideology of automobility—that freedom consists of the ability to go anywhere at any time—is now reborn, in zombie form, on the Net. Of course it also exists in terms of global travel. I’ve previously mentioned the incongruity between individuals proudly declaring that they live in the city so they don’t drive yet bragging about how much they fly.

For the 5% or so that comprise the world’s jet-setting, cloud-dwelling élite, gratification is as much the rule as it ever was for the much-condemned postwar suburbanites, only now it has to be instantaneous and has to demonstrate their ever-more total power. To mix my pop culture references, perhaps that is the lesson we can take away from Mad Men. As Don Draper moves from the suburb to the city, his life loses its trappings of familial responsibility, damaged and conflicted though they may have been, in favor of a designed lifestyle, unbridled sexuality, and his position at a creative workplace. Ever upwards with gratification, ever downwards with responsibility, ever upwards with existential risk.

Survival depends on us ditching this model once and for all.