TOWARD RESPONSIVE ARCHITECTURES
Philip Beesley, Sachiko Hirosue, and Jim Ruxton
This book is about responsive architectures. The project is an exploration of the
interconnectedness of what surrounds us. The focus of this collection is on a new
generation of interactive systems within science, art and architecture that are
based on constantly evolving relationships. Using a wide definition of architecture
that includes both built and natural realms, we examine dynamic systems and environments at scales ranging from molecules to cities.
We want to pose the question ‘What does responsiveness mean?’ ‘Responsive’
is used throughout this book to speak of how natural and artificial systems can
interact and adapt. Speaking of evolution, we might think of how environments act via natural selection on diverse populations. While that traditional definition is
included here, we also want to include conscious action. Responsiveness implies sensitivity. But stability and isolation – as we see it, the opposite of sensitivity – are often seen as necessary for the analysis of complex systems.1 In traditional scientific method, sensitivity and exposure to the surroundings can be thought of as disruptive ‘input’ that interferes with established working methods. The impulse to create closed systems is not exclusive to science: we could say it runs wherever we hear opposing terms used to describe complex situations: subject/object, self/other, form/function, organic/inorganic, observer/observed, static/dynamic. In the papers of this book we observe art, technology and design dissolving many of these artificial distinctions.
A host of new working methods allow these boundaries to be opened. We want to find strategies for thriving in complex interconnected ecosystems. Nature continues to inspire us: for many of the papers in this collection nature is the fundamental teacher. Biological systems show molecular self-assembly and self-sustainability and serve as a model of the miniature mechanical parts that nanotechnology promises.2 Organisms at every scale contain networks consisting of multiple parts that operate far outside of thermodynamic equilibrium. Examples of these complex feedback mechanisms are found in modern electronic control systems. This ‘imbalance’ creates a kind of charged state of readiness that can resolve perturbations and damage.
The projects within this book transform the environment, and many draw upon the most advanced technologies and economies available in the world. Midway through the past century, the American engineer Buckminster Fuller said:
“…man is just about to begin to participate consciously and somewhat more knowingly and responsibly in his own evolutionary transformation. I include evolution of the environment as a major part of the evolution of humanity.”
But Fuller’s confidence stands in contrast to a cultural anxiety that has accompanied waves of technological advances since the Industrial Revolution. We now routinely embed devices into our surroundings that are triggered by our actions. ‘Intelligent’ building systems now turn on lights, lock and unlock doors and adjust heat. Radio frequency identification tags carrying data are increasingly standard devices attached to consumer items for point-of-purchase accounting and theft deterrence. Along with the proliferation of sensing devices comes the reality that we will be sensed everywhere we go. Who is watching?
“[T]he environment touches man where it hurts…” said Reyner Banham,
the visionary British designer.6 Banham was speaking metaphorically, but biology confirms it is true: the soft tissues and hormonal systems immediately affected
by environmental stress are closely related to the neurophysiology of emotion and pain.7 What does it mean to create a responsive world today? We hesitate.
Interconnectedness in Molecular Detail
At the beginning of the 20th century, alongside the fateful discoveries that resulted in the nuclear weapons of World War Two, chemists and physicists became interested in biology. The new synthesis of disciplines led to the discovery of the double-helix structure of Deoxyribonucleic Acid: DNA. That insight enabled manipulation of biological structure and function at the scale of molecules. The maturing field of molecular biology has again involved repeated flirtations with other disciplines, encouraging a systems-level perspective on the molecular knowledge of organisms.
In parallel, building upon late-nineteenth century zoology, D’Arcy Wentworth
Thompson’s pioneering text On Growth and Form demonstrated that the physical
forms of organisms can be understood as ‘diagrams of forces’ that trace physical
influences within the environment over long time periods. Adaptation to the environment through intimate linkages of natural forms and functions has now been described in mathematical detail.
Another watershed moment was the Human Genome Project, the effort to compile the complete genetic sequence of the human organism. Large arrays of experiments were processed at the same time, requiring interdisciplinary teams with specialists from robotics, quantitative image analysis, chemistry, biology, and materials science. This cooperative project required processing on a massive scale, including observations of hundreds of changes in activity within a cell on a single chip.
Looking at multiple processes encouraged moving beyond the concept
that single genes are responsible for specific traits. The relationship between genotype and phenotype has traditionally been thought of as ‘cause and effect’, where genes act as a blueprint for life. A genotype is the genetic makeup of an organism; a phenotype is the set of observable characteristics that results from the interaction between that genetic makeup and the environment. However, it is increasingly clear that the relationship is by no means a one-way street. Environments influence organisms through selection acting on phenotypes, not directly on genes. With the advent of such chips, we can now study how groups of genes act in concert. A convergence of dynamic ‘network’ thinking from information technology and computer science has contributed to this understanding. The boundary between environment and organism is indeed blurred.
In full circle from D’Arcy Wentworth Thompson’s research, microscopic observations have shown that cell shapes are dictated by three-dimensional skeletons that mirror large-scale architectural space-frames. Cellular shape has been directly linked to the processes of chemical signaling, gene regulation, and development, demonstrating that form and function are intimately linked at the molecular level. New approaches to three-dimensional cell culturing systems have been developed to serve stem cell research and tissue engineering. These culturing systems in turn reduce the need for animal experiments. New developments in materials compatible with physiology, and miniature fabrication methods similar to those used for manufacturing computer chips, have contributed to this progress.
The quantitative study of complex biological systems is a four-dimensional
problem that includes the critical dimension of time. To effectively study the multiple dynamic processes that occur in cells and organisms, new approaches are needed. Analysis tools that support visualization and analysis in space and time are required, and specimens need to be alive. Familiar medical technologies such as Magnetic Resonance Imaging and Positron Emission Tomography have been miniaturized to permit analysis of molecules and cells in living animals. Examples of new analysis equipment include high-speed microscopy featuring shutter speeds timed in nanoseconds, and methods that allow ‘seeing’ deeper than a single cell layer in near-live conditions. These techniques are supported by an expanding palette of ‘marker’ molecules that can label a specimen without interfering with its original function. Marking materials include proteins derived from jellyfish, quantum dots, and superparamagnetic iron oxide.
These materials permit the observation of single and grouped molecules. Optical tweezers and atomic force microscopy allow probing and manipulation at microscopic and atomic scales. This ability to probe means that mechanical properties can be measured alongside observations of spatial and chemical dynamics. The two-way street of evolutionary development often plays itself out through molecular exchanges that can be detected using these tools. The kind of data collected in this research draws from a cluster of related disciplines, including computational algorithms and quantitative analyses from applied mathematics.
Using terms of reference derived from the structural engineering of buildings, Donald Ingber proposed that cells contain ‘tensegrity’ structures. He suggested that they are organized as triangulated three-dimensional geodesic skeletons akin to Buckminster Fuller’s revolutionary dome architecture from the past century. The new tools demonstrate that these skeletal elements indeed distribute and sustain their own weight.
Molecular level biology is now poised to work with critical questions of shapes and structures at the scale of atoms, cells and organisms. By manipulating shape and structure of organisms, fundamental relationships with their communities and microenvironments are altered. It does not stop there. In the same manner, we are able to approach how organisms respond to ‘macroenvironmental’ factors that span the scale of the galaxy, including geomagnetic and gravitational forces. Think about circadian ‘clocks’ that guide our own responses to the cycle of night and day, or the navigational instincts that are transmitted through generations in migrating birds and insects.
The confluence of disciplines has created an effective research environment for considering multiple scales and dimensions in the natural world. In turn, the natural world is starting to be revealed in molecular detail as a dynamic ecology of interconnectedness.
Building Responsiveness
A wave of new industrial processes is transforming building design and construction. The next generation of architecture will be able to sense, change and transform itself. The tools and materials discussed here make this kind of responsive architecture possible.
Rigidity and resistance to the external environment are normal qualities in
building. Traditional buildings use components of construction fabricated in a strict order. For example, a foundation and structural core in concrete might form the basis for steel columns supporting floor plates, and on these a grid of windows may be hung. The first stages of construction normally form an immovable and stable base that supports the entire assembly of building components. But new generations of buildings do not rely on completely stable foundations. Rather than relying on centralized support, they are designed to accommodate constantly shifting forces. These new systems tend to distribute their loads throughout interlinking structures that can withstand changes and deformations.
New architectural projects discussed in this book explore structural systems
based on tensile and ‘tensegrity’ systems in which stretching and pulling forces can play throughout a structure. These hybrid structures are accompanied by design methods where complex relationships can be analyzed and refined, and by a fresh palette of building elements made possible by computer-controlled prototyping and manufacturing. New fibers used in architecture include composites of glass and carbon that are stronger, more agile, and more energetically efficient than traditional steel and glass assemblies. This kind of building involves new methods of construction using continuous chains of components and distributed structures.
Building Information Modeling (BIM) is a process where three-dimensional forms, engineering systems and component specifications are integrated within massive arrays of information. Similar to the fundamental implications of the Human Genome Project, BIMs now have formidable influence on architecture. Systematically coded and organized components can be custom-made off-site as a building assembly kit, assembled, and then managed through the life of the building.
Computer-aided design is capturing the geometric relationships that form the foundation of architecture. Finite Element Analysis is a method that breaks down a continuous structure into many simple, linked elements. This allows formerly unthinkable forms to be assessed for mechanical, material, and energy requirements and to be realized as built structures. Form-finding software supports analysis of freeform structures in order to find optimal thicknesses and arrangements of supporting elements. Form-finding is often enhanced by biomimicry, a design method that follows principles from nature.
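As a rough illustration of the finite element idea, the short Python sketch below breaks a single bar, fixed at one end and pulled at the other, into a chain of simple linked elements and solves for how far the free end moves. The element count, stiffness and load are arbitrary assumptions made for the example, not values drawn from any project in this book.

# Minimal 1D finite element sketch: a bar fixed at one end, pulled at the other.
# The continuous bar is broken into simple linked elements; each contributes a
# small stiffness matrix that is assembled into a global system K u = f.
import numpy as np

n_elements = 10            # illustrative: number of simple linked elements
EA = 1.0e6                 # illustrative axial stiffness (E * A), in newtons
L = 2.0                    # total bar length, metres
load = 500.0               # axial load applied at the free end, newtons

le = L / n_elements        # length of each element
k_e = (EA / le) * np.array([[1.0, -1.0],
                            [-1.0, 1.0]])   # stiffness matrix of one element

n_nodes = n_elements + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elements):
    K[e:e + 2, e:e + 2] += k_e   # assemble each element into the global matrix

f = np.zeros(n_nodes)
f[-1] = load                     # point load at the free end

# Fix the first node (the 'foundation'), then solve for the other displacements.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

print("tip displacement:", u[-1])   # for this simple case, equals load * L / EA

Even in this toy case, the same pattern scales to the freeform structures described above: many simple elements, assembled into one large system of equations that a computer can solve.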
Parametric modeling is a new approach that allows designers to control variables of the design through models that can coordinate and update themselves. These systems can automatically update the entire model or drawing set based on changes as small as a joint or as large as the entire floor plan. New research concepts show how parametric systems can support the exploration of multiple complex alternatives. Software tools such as Bentley Systems’ Generative Components offer flexible design of deeply nested relationships. They accomplish this by organizing ‘dependency’ networks akin to the complex process diagrams used to express relationships in natural systems. In much the same way that a mutating virus can generate biodiversity, individual variation can be achieved economically. Multiple variations can be created by manipulating digital code to create detailed individual sets of instructions for manufacturing. The cost of making one thousand identical parts and one thousand individual parts with slight variations can be almost the same. The building design industry is in the very early stages of adoption of these tools.
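The following sketch, written in plain Python rather than in any actual parametric modeling package, suggests how such a dependency network behaves: dependent quantities are defined as functions of named parameters, and changing one parameter regenerates everything downstream, so a run of slightly different variants costs little more than a single one. The facade-panel parameters and formulas are hypothetical.

# Minimal parametric-dependency sketch: dependent values are functions of named
# parameters; changing a parameter regenerates every downstream value.
class ParametricModel:
    def __init__(self, **params):
        self.params = params
        self.rules = {}                   # name -> function of the parameter dict

    def define(self, name, rule):
        self.rules[name] = rule

    def set(self, **changes):
        self.params.update(changes)       # one change propagates on evaluation

    def evaluate(self):
        return {name: rule(self.params) for name, rule in self.rules.items()}

# A toy facade bay driven by two parameters (hypothetical values).
model = ParametricModel(bay_width=3.0, panel_count=12)
model.define("panel_width", lambda p: p["bay_width"] / p["panel_count"])
model.define("joint_positions",
             lambda p: [i * p["bay_width"] / p["panel_count"]
                        for i in range(p["panel_count"] + 1)])

# Generating many slight variants is nearly as cheap as generating one:
variants = []
for count in range(10, 14):
    model.set(panel_count=count)
    variants.append(model.evaluate())

print(variants[0]["panel_width"], len(variants))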
Computer assisted design-to-fabrication methods are transforming what we can make. Custom cutting, shaping, and depositing tools invite new forms. Versatile modular construction systems that allow integration of diverse parts are made possible by direct-manufactured systems. Digital fabrication allows a designer to work closely with industrial production in this process. Perhaps the biggest impacts of this technology are being felt in the massive economies of traditional steel, wood and concrete construction, where automation and prefabrication have transformed the industry. The wasteful practices of solid-timber framing are increasingly a thing of the past, replaced by stranded and laminated composites that can employ almost every part of timber harvested from managed forests. Direct-manufactured steel systems allow coded and organized components to be custom-made off-site as a building assembly kit. Similarly, custom formed concrete is now possible, no longer the exclusive province of lavish budgets.
Numerically controlled fabrication machinery allows the production of prefabricated
formwork for relatively economical freeform cast construction.
This increasingly fine-tuned approach to building component design, and the flexibility and movement achievable in new building systems, changes the fundamental behaviour of buildings. Architecture can now be operated as an instrument. Composite building structures now incorporate sensors, displays, and a range of mechanical functions much like those that outfit a car today. Many of our actions activate automatic responses in our environment. Buildings contain a myriad of sensors that detect temperature, humidity, light, fire and many other parameters relevant to the operation of the facility and the safety and comfort of their occupants. Modern public toilets have a number of sensing devices for our convenience. There are motion detectors that turn on lights and hand dryers, and distance sensors that determine appropriate times to flush.
The proliferation of sensing devices means that we can be sensed everywhere we go. Radio Frequency Identification technology will soon replace bar codes on consumer goods. Yet unlike bar codes, these radio broadcasts also follow and identify us at home. Who should have access to all this data? The questions quickly become personal: if I am detected doing something private, do I have the right not to let other people know? Who holds the controls? The consequences of this new wave of ‘making’ are not simple.
Personal Scale
From an artist’s point of view, responsive systems conjure up a world rich in both possibilities and poignant issues. Sensing devices are becoming ubiquitous. Interactive systems using sensing devices are now available as part of an artist’s palette. The manufacture of these sensing devices for high-volume commercial use has provided access to artists who want to create interactive systems responding to movement, light, touch, heat, acceleration, and position. Because these devices are increasingly inexpensive, it becomes possible to use them in open-ended experiments. In turn, this can invite users to probe the public and commercial implications of these systems.
The proliferation of consumer-level ‘gaming’ computers has funded the engineering of highly efficient processors and large memory capacity, supporting manipulation of video and audio signals in real time. By interfacing sensors with computer programs, artists are able to create complex real-time responsive systems that include audio and video effects. An example of an interactive system for dance is Isadora, developed by the American media artist Mark Coniglio of Troika Ranch, a dance company that presents media-rich performances. This program offers a graphic interface that allows easy programming and manipulation of video and audio compositions. Using sensors or cameras, physical action can be used as a control variable. Coniglio designed this system to be used in a performance environment. The recent performance by Troika Ranch, 16 [R]evolutions, revealed the versatility of the system, which effectively makes the interactive system a kind of ‘performer’ acting in parallel with human dancers.
Many of the interactive systems currently available are the result of an artist developing software for their own use and then making it available to others. Toronto artist David Rokeby’s Very Nervous System software provides a way for artists to achieve inexpensive motion tracking using a video camera for mapping physical movement. The system is often used in dance performances where, for example, the upper body can be mapped to activate one set of sounds while the lower part of the body might activate other sounds. An entire space can be made responsive by programming sound and video to play in response to signals collected from different locations or zones.
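This is not Rokeby’s code, but the underlying idea of zone-based camera tracking can be suggested in a few lines of Python using the OpenCV library: successive camera frames are differenced, and the amount of change in each zone of the image becomes a trigger. The two-zone layout, frame count and thresholds here are arbitrary assumptions.

# Minimal zone-based motion sketch: difference successive camera frames and
# measure activity in the upper and lower halves of the image, which could
# then trigger different sounds.
import cv2

cap = cv2.VideoCapture(0)                    # default camera
ok, frame = cap.read()
if not ok:
    raise RuntimeError("no camera frame available")
previous = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

THRESHOLD = 25                               # arbitrary pixel-change sensitivity

for _ in range(300):                         # run for a fixed number of frames
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, previous)       # pixel-by-pixel change between frames
    _, motion = cv2.threshold(diff, THRESHOLD, 255, cv2.THRESH_BINARY)
    previous = gray

    h = motion.shape[0]
    upper_activity = cv2.countNonZero(motion[: h // 2])   # 'upper body' zone
    lower_activity = cv2.countNonZero(motion[h // 2:])    # 'lower body' zone

    if upper_activity > 5000:
        print("trigger sound set A")         # placeholder for an audio event
    if lower_activity > 5000:
        print("trigger sound set B")

cap.release()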
EyesWeb, a software package developed at the InfoMus Laboratory at the University of Genoa, offers the artist a sophisticated tool for analysis of physical gestures. Film production houses use motion tracking systems such as Polhemus and Flock of Birds that allow a point-by-point mapping from actor movements to a virtual character, yielding the realistic movements seen in popular cinema today. These devices work by measuring changes in an electromagnetic field as sensors move through space. Toronto-based sound artist Darren Copeland is currently experimenting with the Polhemus system as an interface for a multi-channel sound diffusion system, showing the breadth of applications in which these sensor systems can be used. By moving sensors through space, Copeland is able to control a multi-channel audio environment.
This kind of software can provide direct relationships between stimuli and actions, and it can also ‘participate’, making decisions and taking random steps that add complexity to the composition. Functions can be added to the software to yield life-like effects that simulate natural movement. For example, by including rules from natural physics in modeling software, movement that imitates the interactions of physical bodies moving within gravity can be simulated. This processing can add sensual qualities to animations within virtual performance space. These qualities can also be employed in feedback loops where automated ‘outputs’ are fed back into the system as new ‘input’, producing complex and subtle results. A particularly interesting development is in the exploration of rarely-tapped dimensions such as proprioception, the sense of the body’s position with respect to itself in space.
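A minimal sketch of such a physics rule, with arbitrary constants and no connection to any particular composition tool, steps a virtual point mass forward under gravity, feeding each ‘output’ state back in as the next ‘input’:

# Minimal physics-rule sketch: a virtual point mass falling under gravity and
# bouncing, stepped forward in a feedback loop where each output state becomes
# the next input. All constants are arbitrary illustrations.
GRAVITY = -9.81      # m/s^2
DT = 0.02            # time step, seconds
DAMPING = 0.7        # fraction of speed kept at each bounce

position = 2.0       # metres above the 'floor'
velocity = 0.0

for step in range(300):
    velocity += GRAVITY * DT          # the rule: acceleration changes velocity
    position += velocity * DT         # velocity changes position
    if position < 0.0:                # collision with the floor
        position = 0.0
        velocity = -velocity * DAMPING
    # the new state is fed back as the input of the next iteration,
    # producing the familiar decaying bounce of a physical body
    if step % 25 == 0:
        print(f"t={step * DT:4.2f}s  height={position:5.2f}m")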
Wireless networking and low-cost home systems that adapt existing building power circuits allow development of interactive systems that can communicate over substantial distances. In the last decade, artists have had access to small receiver-transmitter pairs that operate within an unlicensed Industrial Scientific and Medical wireless band. The recent introduction of Bluetooth and Zigbee technologies has given increased flexibility to wireless networking by allowing nodes to ‘talk’ to each other in networked configurations, opening new possibilities for remote operation.
Networked compositions can involve subtle exchanges. The Toronto work “Heavy Breathing” allowed two participants in different locations to digitize their breath and send it back and forth by breathing into an apparatus while a fan recreated the transmitted breath. The recent Ku:iyashikei-net by Urico Fujii and Ann Poochareon allowed the transmission of tears over the internet.
What makes these mediated experiences so attractive? Interactive installations offer expanded powers: a small movement can be programmed to produce a world of sounds. Interactive systems can allow a performer to take control of light, sound and video within their environment. No longer reliant on sound or lighting cues, performers can find spontaneity in their actions. However, the experience likely goes far beyond ‘power’. When someone enters an interactive installation, the immediate response to their presence can yield a powerful sense of personal connection. Artists have reacted to the proliferation of virtual meeting places and the accompanying loss of physical touch by exploring new ways of transmitting intimacy over a network. In today’s mediated society, ‘touch’ has complex implications.
What is it that drives us to create ‘responsive architectures’? Perhaps it is a sense of empowerment and involvement that drives interactive technologies forward. Is it because as a society we are becoming more cerebral that we crave increased movement around us? We rely less and less on our bodies. While children previously spent much of their time running and jumping, they now spend more time making avatars run and jump on a screen with a flick of their fingers. Creating more efficient structures and machines will further reduce the necessity of the human body. At the same time, this increasingly cerebral culture provides increased capacity for understanding how human bodies work. The study of nature reveals an interconnected set of mechanisms guided by structural and chemical ‘intelligence’. These systems are a potent model for how we can impart sensuality and kinesthetics in buildings and machines. The importance of these qualities seems to increase as our physical bodies fade.
Seen in this way, the receding function of an original human body forms a poignant equation of loss and gain. Lost: the corporeal sensation and connections between bodies. Gained: a redefined ‘body’ whose expanded border embraces the surrounding environment.
The pursuit might be toward the sublime. Perhaps the sense is akin to reaching the end of a very long period of loneliness, or to returning home after an extremely long journey.