I&IC Workshop #4 with ALICE at EPFL-ECAL Lab, brief: “Inhabiting the Cloud(s)”

Note: we will start a new I&IC workshop in two weeks (02-06.02), led by the architects of the ALICE laboratory (EPFL) under the direction of Prof. Dieter Dietz, doctoral assistant Thomas Favre-Bulle, architect scientist-lecturer Caroline Dionne and architect studio director Rudi Nieveen. During this workshop, we will mainly investigate the territorial dimension(s) of the cloud, as well as distributed “domestic” scenarios that develop a symbiosis between small decentralized personal data centers and the act of inhabiting. We will also look toward a possible urban dimension for these data centers. The workshop is open to master and bachelor students of architecture (EPFL), on a voluntary basis (it is not part of the curriculum).

A second workshop will also be organized by ALICE during the same week on a related topic (see the downloadable pdf below). Both workshops will take place at the EPFL-ECAL Lab.

I reproduce below the brief that ALICE distributed to the students.

 

Inhabiting the Cloud(s)


Wondering about interaction design, architecture and the virtual? Wish to improve your reactivity and design skills?

Cloud interfaces are now part of our daily experience: we use them as storage space for our music, our work, our contacts, and so on. Clouds are intangible, virtual “spaces” and yet, their efficacy relies on humongous data-centres located in remote areas and subjected to strict spatial configurations, climate conditions and access control.
Inhabiting the cloud(s) is a five-day exploratory workshop on the theme of cloud interfacing, data-centres and their architectural, urban and territorial manifestations.
Working from the scale of the “shelter” and the (digital) “cabinet”, projects will address issues of inhabited social space, virtualization and urban practices. Cloud(s) and their potential materialization(s) will be explored through “on the spot” models, drawings and 3D printing. The aim is to produce a series of prototypes and user-centered scenarios.

Participation is free and open to all SAR students.

ATTENTION: Places are limited to 10, register now!
Info and registration: caroline.dionne@epfl.ch & thomas.favre-bulle@epfl.ch
www.iiclouds.org

-

Download the two briefs (Inhabiting the Cloud(s) & Montreux Jazz Pavilion)

 

Laboratory profile

The key hypothesis of ALICE’s research and teaching activities places space at the focus of human and technological processes. Can the complex ties between human societies, technology and the environment become tangible once translated into spatial parameters? How can these be reflected in a synthetic design process? ALICE strives for collective, open processes and non-deterministic design methodologies, driven by the will to integrate analytical, data-based approaches and design thinking into actual project proposals and holistic scenarios.

 

http://alice.epfl.ch/

 

Clog (2012). Data Space


 

Note: we mentioned this “bookazine”, Clog, in our bibliography (Clog (2012). Data Space, Clog online) at the very early stages of our design-research project. It is undoubtedly one of the key references for this project, mostly concerned with thinking, territory and space, and therefore rather oriented toward the architectural field. It will certainly serve in the context of our workshop with the architects (in collaboration with ALICE) next week, but not only, as it states some important stakes related to data in general. This very good and inspiring magazine is driven by a pool of editors: Kyle May (editor-in-chief, whom we invited as a jury member when we –fabric | ch with Tsinghua University– organized a call during the 2013 Lisbon Architecture Triennale, curated by Beatrice Galilee), Julia van den Hout, Jacob Reidel, Archie Lee Coates and Jeff Franklin.

The edition is unfortunately sold out, which is why I assembled several images from the bookazine (for research purposes) in a pdf that can be downloaded here (60mb).

 

From the editorial (May 2012):

“Over two billion people across the world use the Internet regularly. Every second, 2.8 million emails are sent, 30,000 phrases are Googled, and 600 updates are tweeted. While being absorbed into this virtual world, we rarely consider the physical ramifications of this data. All over the world, data centers are becoming integral components of our twenty-first century infrastructure. These facilities can range from small portable modules to massive warehouses full of servers – from sleek new constructions to reuse of existing infrastructures. What is the significance of this bridge between the virtual and the physical? How does this new typology affect the discourse of architecture and the shaping of our built environment? As cloud storage and global Internet usage increase, it’s time to talk about the physical space of data.”

Donaghy, R. (2011). Co-opting the Cloud: An Architectural Hack of Data Infrastructure. Graduate thesis work.

Part of our bibliography (among different works by architects –K. Varnelis– or about the Internet infrastructure –T. Arnall, A. Blum–) and published in Clog (2012), this thesis work by R. Donaghy presents an interesting hack of the data center infrastructure (centered on the hardware and mostly on the object “data center” in this case).

The work is digitally published online on Issuu and can be accessed here (pp. 134-150).

Reblog > Power, Pollution and the Internet

Via The New York Times (via Computed·By)

-


SANTA CLARA, Calif. — Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

 

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.

A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.

Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.

To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.

Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data. The study sampled about 20,000 servers in about 70 large data centers spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.

“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”

These physical realities of data are far from the mythology of the Internet: where lives are lived in the “virtual” world and all manner of memory is stored in “the cloud.”

The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.

Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.

“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a nonprofit industry group. “It’s too many insurance policies.”

The full article @ NY Times
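Note: as a rough illustration of the orders of magnitude cited in the article, here is a small back-of-envelope sketch (in Python, purely indicative). The inputs are the article’s own estimates; the output of roughly one gigawatt per nuclear plant is an assumption implied by the “30 plants” comparison, not a figure from the article.

```python
# Back-of-envelope sketch of the scale of the figures cited above.
# All inputs are the article's estimates; the ~1 GW output per nuclear
# plant is an assumption used only for illustration.

worldwide_load_w = 30e9       # ~30 billion watts drawn by data centers worldwide
plant_output_w = 1e9          # assumed output of a single nuclear plant (~1 GW)
us_share = (0.25, 1 / 3)      # US data centers: one quarter to one third of that load
utilisation = (0.06, 0.12)    # share of server electricity doing actual computation

print(f"Equivalent nuclear plants: ~{worldwide_load_w / plant_output_w:.0f}")

us_low, us_high = (worldwide_load_w * s / 1e9 for s in us_share)
print(f"US data-center load: ~{us_low:.1f} to {us_high:.1f} GW")

# If only 6-12% of server power performs computations, the remainder keeps
# servers idling, ready for activity surges (per the McKinsey analysis).
for u in utilisation:
    print(f"At {u:.0%} utilisation, ~{1 - u:.0%} of server power is overhead")
```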

Mejias, U. A. (2013). Off the Network, The University of Minnesota Press.


 

Note: Off the Network, a book by Ulises Ali Mejias, is an interesting read when it comes to objectifying and questioning the network paradigm. Beyond the praise of participation and inclusiveness that was widely voiced by network advocates and is now also used by marketing companies, Off the Network brings a critical voice: it addresses the centralization, or in other cases the “nodocentrism”, at work in many global online services, as well as the commodification of many aspects of our lives that comes with them.

While we are looking for alternative “architectures” for cloud infrastructure, nodes and services, this is a “dissonant” point of view to take into account and a book that we are integrating into the I&IC bibliography.

 

From the book’s blurb:

“Off the Network shows us that centralization of online services is not accidental. Take a look behind the social media noise and read how algorithms condition us. Ulises Ali Mejias carves out a postaffirmative theory of networks. No more debates about whether you are a dog or not; identity is over. Power returns to the center of Internet debates. Off the Network disrupts the illusion of seamless participation – it sides with the resisters and rejecters and teaches us to unthink the network logic. Its message: don’t take the network paradigm for granted.”

 

A few images for a brief history (of computing) time

Note: a set of images I’ll need for a short introduction to the project.

From mainframe computers to cloud computing, through the personal computer and all the other initiatives that tried to bend “the way computers go” and give people who were not necessarily computer savvy access to tools. Under this perspective (“access to tools”), cloud computing is a new paradigm that takes us away from that of the personal computer. To the point that it brings us back to the time of the mainframe computer (no access, or difficult access, to tools)?

If we consider how far the personal computer, combined with the more recent Internet, has changed the ways we interact, work and live, we should certainly pay attention to this new paradigm that is coming and probably work hard to make it “accessible”.

 


A very short history in 14 images (pdf).