Call for Papers

100th Annual Meeting - Digital Aptitudes
 
ACSA 100th Annual Meeting 
March 1-4, 2012 in Boston, MA
Host School: Massachusetts Institute of Technology
Co-chairs: Mark Goulthorpe, Massachusetts Institute of Technology; Amy Murphy, University of Southern California




Open Sessions

In recognition of our 100th Anniversary, ACSA will be offering multiple thematic open sessions.

Community
Tom Fisher, University of Minnesota
 
Disaster Recovery
Reinhard Goethert, Massachusetts Institute of Technology
 
Diversity
Brian Kelly, University of Maryland
 
History/Theory
Vittoria Di Palma, Columbia University
 
Sustainable Design
Adrian Parr, University of Cincinnati
 
Urbanism
Tim Love, Northeastern University

Thematic Sessions

100_1: THE UTOPIC, DYSTOPIC, AND HETEROTOPIC HISTORIES OF 20th-CENTURY TECHNOLOGY
1912: Progress, Technology, and Nature
     Fran Leadon, City College of New York


The ACSA was founded in 1912, a pivotal moment in the Progressive Era, when American architects were struggling to cast off nineteenth-century historicism while grappling with how best to represent emerging technologies. Both Cass Gilbert’s Woolworth Building and Warren & Wetmore’s Grand Central Terminal were nearing completion at the beginning of 1912, and both were essentially modern buildings cloaked in the forms of revivalism: the Woolworth Building, marketed as an opulent “vertical ocean liner,” was at heart a steel frame draped in Gothic Revival clothes, while Grand Central was a Roman Revival temple masking the largest matrix of railroads in the world.

1912 marked the ascent of Woodrow Wilson as the face of the Progressive movement, and when word arrived early that year that explorers had reached the South Pole for the first time, the feat seemed to foreshadow an era of limitless technological possibilities. Cities in particular became epicenters of Progressive ideals, exploding in both physical density and symbolic meaning, laden with the potential of urban-based technology: Boston, New York, Chicago, St. Louis, and San Francisco pulsated with electric arc streetlights, speeding automobiles, hurtling subways, and elevated railroads. At the beginning of 1912, technology seemed poised to dominate nature.

But nature won a knockout round in April 1912, when RMS Titanic, a technological wonder, symbol of progress, and last artifact of the flickering Gilded Age, collided with an iceberg and sank in the North Atlantic. Only two years later, Europe descended into the unimaginable horrors of World War I, and faith in technology as a form imbued with historical precedent was lost forever. Modernism as a style began to take over from nineteenth-century revivalism as the “look” of technology, and architecture, in turn, began to take aesthetic cues from machines. Architecture as a profession emerged from World War I less interested in building “vertical ocean liners” and more interested in discovering what William Cronon refers to as second nature: design – whether of factories, housing, offices, bridges, dams, or transportation systems – so ingrained in its environment that it becomes a seemingly “natural” occurrence.

This session invites papers that look back at the watershed moments of the last 100 years, beginning in 1912, to explore the relationship between architecture, progress, technology, and nature. Paper topics might include technology’s influence on architectural style; shifting definitions of “progress” (from the construction of New Law tenements during the Progressive Era and demolition of those same tenements during the Urban Renewal years, to historic preservation, adaptive reuse, and today’s preoccupation with sustainability); how cataclysmic events (the sinking of the Titanic, the World Wars, the Great Depression, Vietnam, the Gulf Wars, September 11, Hurricane Katrina) have shaped architecture; and how architects have defined the “nature of nature” as both a technological and progressive issue (Wright’s pastoral ideal vs. Le Corbusier’s obsession with efficiency, for example). This session especially seeks papers that look back to the Progressive Era in order to speculate on how “progress” might be redefined in the years ahead.
  

100_2: THE EARLY CONSOLIDATION OF A DIGITAL DESIGN PARADIGM
1988–1997: Ambitions and Apprehensions of a “Digital Revolution”
      John Stuart, Florida International U.
      Sunil Bald, Yale University
 
While characterizations of decades to describe cultural and technological movements can seem arbitrary, they can also provide helpful starting points for further investigation. This session seeks papers that explore the period from 1988 to 1997, when architectural education and culture were undergoing a “digital revolution.” Roughly framed by MoMA’s Deconstructivist Architecture exhibition (1988) and the opening of Frank Gehry’s Guggenheim Museum in Bilbao (1997), the decade offers fertile ground for inquiries into changes in architectural education and culture during a time when formal paradigms, techniques of production and representation, and accepted historical narratives were in a condition of particular flux. The ’90s arguably demonstrate the productive and transformative potential of an unstable and transitional period.

This ten-year span saw the use of Alias animation software (the forerunner of today’s Maya) in the creation of The Abyss (1989), Terminator 2 (1991), and Jurassic Park (1993). Recent architectural graduates who knew these programs were in demand by film studios, a circumstance that shaped the aspirations of future students and architectural pedagogy alike. If the films provided optimistic dreams of bright futures for computer animation, the First Gulf War (1990), marked by televised infrared night views and the images of buildings caught in the cross-hairs of computer-guided precision missiles, offered technologically enhanced nightmares of political and economic upheavals for decades to come.

Just as technology opened new vistas for investigation during this period, architectural history flourished through new understandings of the past as well as recontextualizations of contemporary phenomena. From the inclusion of political ideology (Mary McLeod’s important essay “Architecture and Politics in the Reagan Era: From Postmodernism to Deconstruction” (1989)) and constructions of identity (Beatriz Colomina’s Sexuality and Space (1992)), to an apolitical globalism (Rem Koolhaas and Bruce Mau’s S, M, L, XL (1995)), the world outside became very much a part of an expanded architectural discourse.

Two years after the 1992 construction of Gehry’s Fish Sculpture in Barcelona, one of the first structures entirely designed and constructed using computer software, Bernard Tschumi introduced the “paperless” studios, with forty-two networked workstations allotted to third-year graduate design studios. These completely digital design studios foreshadowed the introduction of Computer Aided Manufacturing into architectural education and paralleled the computerization of the profession and of building industries more generally. This shift allowed a dialogue to develop between those who were trained in the analogue educational system and those—in many cases just a year or two later—who were trained in the digital design studios.

We seek papers that both highlight the challenges produced by the digital transformations of this period and question their impact on architectural practice, education, and theory. How did this shift toward the digital realm—entangled within the political, economic, social, and cultural changes of its time—affect architectural pedagogy, community engagement, and approaches to form and the environment? We are looking for a geographic breadth of perspectives in an effort to gain new insights into the hopes and anxieties of this critical ten-year period in architectural history.
  

100_3: THE OPENING OF OTHER (N) DIMENSIONS
4D Architecture
      Keith Green, Clemson U.
 
Architects are said to have the unusual capacity to draw on a vast range of information and material sources in creating architectural works responsive to local and global conditions. Today’s issues – the expanding knowledge economy, new technologies associated with computing and advanced materials, striking demographic changes, and unprecedented sprawl, among them – demand from architects responses beyond the two- and three-dimensional. While architecture is generally regarded as a static form, outside time, it is inseparable from the dynamic conditions that surround, permeate, and construct it. In response, architecture today should emerge not as plans (2D) or envelopes (3D) but rather as inseparable webs of physical and social relationships operating within time (4D) and across scales (from that of the human hand to that of the metropolis).

4D architecture might be situated in a single place or it might be mobile; it might be integral to the building fabric or it might be something added or attached; it might employ new technical possibilities or it might recuperate established technical means; and it might just cultivate new models of human identity. In any case, 4D architecture – complex, indeterminate, and set within an indefinable, uncertain situation – promises to be a far more appropriate response to today’s broader critical concerns associated with the built environment: accessibility, consumption, flexibility, and production.

What does it mean to introduce time as a significant variable in design? How do we accommodate time and motion in design? How do we design an architecture that, itself, moves? What do time-based media offer design processes and outcomes? How are existing public and private spaces at various scales redefined with the conceptualization of architecture as 4D? What socio-cultural opportunities and challenges does designing in 4D present to inhabitants and the wider populace? And does 4D architecture mark the end of “post-modernity”?

Paper submissions are sought that present either 4D design projects undertaken by the author(s) or critical perspectives on historical, visionary, or contemporary 4D projects and practices. Topics of interest include those referencing new media, advanced and/or sustainable materials and systems, intelligent (responsive) environments, and other time-based practices and projects at all scales. Whatever the case, authors are encouraged to be highly methodical, critical, and reflective in their engagement of the topic of 4D architecture.

100_4: THE MATERIALS AND TECTONICS OFFERED BY COMPUTATION
Advanced Composite Fabrication Technologies for Architecture
      Michael Silver, Mike Silver Architects
 
Fiber-reinforced plastics have been used in military aircraft since the early 1970s. Today, these high-strength substitutes for aluminum and steel are making their way into the civilian market with ultra-light, fuel-efficient, carbon fiber designs such as the Boeing 787 Dreamliner and the Premier One business jet by Hawker Beechcraft. While both planes incorporate the latest advancements in material science and engineering, only the Premier One “features an all composite fuselage…built without an internal frame.” This radical break with tradition could not have been achieved without computer-automated fiber placement technology, an entirely new process that is necessitating fundamental changes in the way we think about construction.

The system itself employs a robotically controlled taping head that lays down glass, Kevlar, or quartz reinforced strips on a reusable mold. By controlling the placement and orientation of fibers in a single component, engineers can manage the flow of forces as they change along the surface of a part subject to mechanical loads. (Fiber-reinforced composites are on average 1.5 times stronger and 10 times lighter than cast-in-place concrete.) As replacements for structural components that are overdesigned and isotropic (I-beams, T-sections, etc.), robotically fabricated composites achieve their efficiency by increasing material densities where stresses are high and decreasing them where they are low. (Nonstandard taping patterns and multiple plies are produced without a subsequent increase in cost.) The complexity of the final component comes for free, by simply changing the data fed to the taping robot.
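
To make this logic concrete, here is a deliberately minimal sketch in Python (illustrative only, with hypothetical ply limits; it is not actual fiber-placement control code): a normalized stress value for each region of a part is mapped to a ply count, so that material density rises where stresses are high and falls where they are low.

    def plies_for_stress(stress, min_plies=2, max_plies=12):
        """Map a normalized stress value (0.0-1.0) to a whole number of plies."""
        stress = max(0.0, min(1.0, stress))  # clamp out-of-range inputs
        return round(min_plies + stress * (max_plies - min_plies))

    # A coarse, hypothetical stress map over regions of a panel, and its schedule.
    stress_map = {"corner": 0.9, "edge": 0.5, "center": 0.15}
    print({region: plies_for_stress(s) for region, s in stress_map.items()})
    # -> {'corner': 11, 'edge': 7, 'center': 4}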

New composite fabrication processes not only point to a fundamental transformation of design, a ‘paradigm shift’, but they also facilitate practical changes in the relationship between ecology and use. With automated fiber placement technology there is no need to broker a compromise between structure and program because, in the absence of an articulated frame, anisotropic membranes can be optimized for both without conflict or separation. This is possible because, in a single area, patterns, ply depths, and tape widths can be manipulated independently with a high degree of precision. In other words, the number of fiber layers placed in section can compensate for the structural deficiencies produced by functional and/or aesthetically driven fiber layouts in plan. The placement process can also manage opposing demands across a wide range of scales, from the very small (1/8” minimum tape width) to the very large (a placement envelope as big as 1,400 SF).

Here we have a flexible, tectonically coherent system in which windows, walls, doorways, and supports are all forged from the same medium. This is a truly unprecedented development in the history of form, one that erases the traditional tension between art and engineering. (Today, building envelopes can be evolved to satisfy the requirements of both simultaneously.) Now at our disposal is a single, integrated building system that conserves materials and creates stronger, nearly weightless assemblies, while expanding the potential for complexity in architecture.
 
 
100_5: THE SPACE AND NATURE OF DRAWING IN A DIGITAL MEDIUM
The Agency of Drawing and the Digital Process
      Andrew Atwood, U. of Southern California

Digital technology has offered us much in the way of design development and increased interactivity with our design environments. Can these same tools and processes be used to redirect and rethink the role of the drawing in architecture?

This session presupposes that: 
1. The drawing is fundamental to the discipline of architecture.
2. The drawing is under attack by the BIM, the screen shot and the algorithm.
3. The drawing has autonomy.
4. The drawing might be an object.

 
Beginning with these presuppositions – and this session is interested in those who agree and those who do not – what is the state of the drawing in contemporary architectural practice and design research? What is the fate of the drawing in a discourse filled with models and procedures but no drawings? What is the fate of architecture without drawings? Perhaps the most important question is: what is a drawing? And has the definition changed as a result of digital design process and practice?
 
This session seeks to explore the boundaries defining drawing in architecture. What other modes of working are possible if we are allowed to take a projective approach to the medium of drawing and allow ourselves to reinterpret, perhaps even redefine what a drawing is? What are different possibilities for the tools and techniques of drawing? We moved seamlessly from the pencil to the plotter but can we marry the two? Can technology be used to reconstitute the drawing? And if the drawing has agency, in what other directions might it take us?
 
I am worried. What will we talk about if we don’t have drawings?
 
 
100_6: DIGITAL GEO-POLITICS
At the Base of the Pyramid: Digital Design and Manufacturing for Extreme Affordability
       Mahesh Senagala, Ball State U.
 
Over four billion people – well over half of the world’s population – live on less than four dollars a day. The advanced markets of the West comprise roughly 0.75 billion people, barely a tenth of the world’s total. In a few more decades, world population is projected to pass nine billion, with a large portion of those people living at the Base of the Pyramid. While the case for addressing the needs of people at the “Base of the Pyramid” is usually framed in terms of social justice and moral obligation, C. K. Prahalad, Muhammad Yunus, and others have convincingly demonstrated that the markets at the “Base of the Pyramid” are viable and dynamic markets full of social entrepreneurial potential. The need is there, as well as the opportunity. Now we turn to the world of architecture.

Digital design, performance simulation, and manufacturing technologies have been buzzwords of late in architectural circles, at least in the West. So far, much of the design and research in these areas has been aimed at mature and resource-rich markets in highly developed countries at the very top of the world’s economic pyramid. Despite significant advances in manufacturing, business modeling, “reverse innovation,” and logistics in general, the architectural world remains largely aimed at producing one-off pieces in and for resource-rich markets. Affordability, let alone extreme affordability, remains largely missing from current discourses in architecture. Further, even within the United States, there exists a need for extreme affordability in what is now being defined as the “fourth world.” As Vijay Govindarajan has pointed out, “reverse innovation” (innovation in the developing world, for both developing and developed markets) has been gaining momentum, particularly in China and India. While there are plenty of examples in the industrial and consumer product design domains, we are hard-pressed to find even a handful of examples of innovation for extreme affordability in architecture. Despite all the talk about mass customization, architectural designers seem to place more importance on the “customization” part while ignoring the “mass” part. Moreover, there is a paucity of research on strategies and tactics for reaching billions of people in the developing world through innovation (not just invention or wild speculation) in all aspects of architecture.

The session invites papers addressing the questions of relevance, means, and methods. What are the technological and design problems of manufacturing buildings, building components, or related products for extreme affordability? What are the impediments and opportunities in pursuing innovation for extreme affordability? What are some of the immediate and long-term design research questions? How can these questions be integrated into architectural curricula? How can these questions address opportunities for mainstream or alternative professional practices? What are some successful examples that made a difference and hold potential to reach millions if not billions of people at the Base of the Pyramid?
 

100_7: THE INFLUENCE OF DIGITAL TECHNOLOGIES ON THE ARTS
Automatism, or, Post-Medium Architecture and Post-War Art
      Sean Keller, Illinois Institute of Technology
 
One of the consequences of architecture’s shift to computational representation should be the belated realization that, like the other arts, architecture has entered what Rosalind Krauss has called a “post-medium condition.” Until recently, and despite its ceaseless efforts at reinvention, architecture had remained comfortably grounded by a definition of itself as drawing-toward-building. Much contemporary exuberance and anxiety springs from the disruption of this long-standing convention, and the accompanying loss of disciplinary security. With clear relevance for contemporary architecture, Krauss has argued that the introduction of new representational technology reconfigures—and thereby exposes to scrutiny—the structure of a discipline in its entirety. This analysis of the post-medium condition in art provides a model for understanding the impact of computation on architecture, and it is in this light that this panel asks: What are the layers of conventions that are determining architecture in the age of computation? In the face of architectural production that seems to celebrate its lack of critical distance, is there a way to conceptualize the practice of architecture as differential and self-differing?

The exploration of these questions can be greatly advanced by returning to the inspiration for much of Krauss’s own thinking: Stanley Cavell’s work on film and postwar painting, and his central notion of automatism. Clustered around automatism are a number of ideas that can aid our understanding of post-medium architecture. Chief among these is the thought that, although automatism emerges out of the technical basis of a representational system, there is no essentialism in its determination. This idea is especially important since it calls into question recent attempts to use computational generation to establish new architectural fundamentalisms. Cavell also notes that, given the absence of stable conventions, the post-medium condition requires that each artistic practice establish the terms of art itself, and that this is most effectively achieved by working in series.


Cavell’s description of automatism further suggests that there are important parallels to be explored between contemporary architecture and the artistic practices of the decades following World War II. As design tools, parametric digital models raise many of the issues central to painting and sculpture from Abstract Expressionism through Post-Minimalism: the possibility of “all-over” non-hierarchical composition; the option for chance or non-deterministic design; an interest in process as much as, or more than, product; the re-appearance of pattern and decoration; the integration of representation and support; and the production of surprising optical and haptic effects through quasi-logical systems. In many respects computational tools have allowed architecture to move beyond the aesthetics of cubism for the first time. Beginning with the concepts of automatism and post-medium art, and suggesting a broad parallel between contemporary architecture and postwar art, this panel seeks to explore architecture’s new potentialities.
 
 
100_8: THE EFFECT OF COMPUTATION ON DESIGN PROCESS
Becoming Computational: Restructuring/ Reconsidering Pedagogy Towards a (More) Computational Discipline

      Christopher Beorkrem, U. of North Carolina at Charlotte
      Nicholas Senske, U. of North Carolina at Charlotte
 
Few would doubt the growing importance of computational methods and thinking within architecture, but what remains unclear is how a discipline such as ours becomes computational. In other words, how do we arrive at the point of integration, when architects understand that computation is not just a tool for helping design, but a way of doing design? When all designers – not only specialists – can practice computationally and ruminate on the subject? The goal of this session is to examine the state of thinking on the matter, tracing possible trajectories and delineating obstacles on the way to making computation not the exception but a normative part of our profession.

We believe that the success of this transformation rests on something much greater than the adoption of a particular level of vocational skill. It requires a cultural shift. Our values, attitudes, and beliefs regarding design must change. As education is one of the primary instruments for implementing disciplinary culture, we propose an examination of the topic through the lens of pedagogy. This session therefore requests papers describing alternative project statements and courses, curricular structures, or pedagogical viewpoints that argue for a restructuring of architectural pedagogy to resolve our apparent separation from computational discourse.

A potential way to approach the topic might be to draw from the educational experiences of other fields that are in the process of redefining themselves computationally: banking, biology, healthcare, journalism, and so on. What can architects learn from their example? How has their way of working – indeed, their very perception of their discipline – changed, and how has this been reflected in their training of late? What might be different about architecture that does not lend itself to the approaches of these fields?

Along similar lines, might students begin to learn about computation before entering architecture school? There are some who say that the ideas of computer science constitute a new literacy, and that programming ought to be a prerequisite introduced long before college, much like math and writing. Would this benefit a more computational discipline of architecture? If so, how might professionals and educators help enable this change? Might we begin to expect such experience, and to evaluate our graduate, or even undergraduate, applicants based upon their understanding of computation?

Alternatively, submissions might consider the question: where does computation fit into the architectural curriculum? Is the natural progression to add to the traditional sequence of two-dimensional drawing and three-dimensional modeling a capstone of advanced four-dimensional parametrics and multi-dimensional BIM? Or is computation a fundamental skill for designers, demanding a transformative redefinition of what we teach? Some institutions have proposed hierarchies similar to these. Others have proposed alternative sequences or the notion of abandoning hierarchy altogether. What is the most desirable outcome, practically, culturally, and for the sustainability or survival of our profession?
 
 
100_9: INTERDISCIPLINARY E-MERGENCE
Beyond Digital: Speculations on Analog Convergence
       Brian Lonsway, Syracuse U.
 
As phone meets TV, Google meets human anatomy, and video gaming meets body scanning, digital convergence rapidly challenges our known concepts of discrete objects, materials, and systems. However, as profound as some of these challenges may be, their ultimate reliance on identifiably discrete digital hardware marks them as a technological development that is clearly apart from us, as biologically analog beings, and thus, somehow, more acceptably artificial.

However, as science makes daily advances in biological computing – where robots are made of gels, algorithms can be replicated in DNA, and phase-change memory can store non-binary states – and as the assemblies of components we still call computers find their way into our clothes, our trash cans, our vehicles, our pets, and soon, likely, our bloodstream, we are beginning to approach the limits of digital computing. Numbers become qualities; logical operations become states of living matter; processors, living organisms themselves; and results, quite possibly, affective states of being. In the near term, it is likely that these innovations, once outside the laboratory, will yield mere analog extensions of what we now comfortably understand to be the computer. But at their core, they propose nothing short of a complete breakdown of this digitally-obsessed thing we call 'the computer.' No longer reliant on binary switching and the concomitant discrete-state functionality of the modern computer, nor any longer manifest as uniquely digital objects operating in our otherwise analog world, biological, molecular, or analog computing – as it is variously termed – removes yet one more barrier between what we might call 'artificial' computation and the rest of our biological, material, and cultural life processes.

Before these post-digital objects come to replace our desktop computers as design tools, before what we now call software becomes a structured biological organism, and before our building materials transform from inert to active matter, we may want to speculate a bit more rigorously about their implications for design. The digital has transformed our practices; this we know. And so will the analog…but how? What might it mean for designers, who for decades have been demonizing or deifying the digital computer as advancing something against our traditional (analog!) work processes, when our paper becomes simultaneously material surface and computational object? What might it mean when materials are not simply 'smart' because of their physical responsiveness or ecological efficacy but because their very molecules are computing their physical properties? What other transformations of our known interactions with digital computation will wane in the advancing convergence between computation, organism, and material?

This session invites rigorous speculations about the future of computing and its implications for design processes, spatial thought, and the practices and pedagogies of architecture. Submissions by multi-disciplinary teams of authors are particularly encouraged. Papers may take the form of experimental findings, design projects, philosophical or theoretical provocations, critical fictions, or pedagogical explorations: of greatest importance is the embrace of a speculative framework grounded in current scientific research in non-digital computing.
 
 
100_10: PARAMETRIC PERFORMANCE
Design Computation: Parametrics, Performance, Pedagogy and Praxis
      Karen Kensek, U. of Southern California
 
Parametric design, fed by cheap desktop computing power, reasonably user-friendly software, and an overwhelming, unrequited love of NURBS curves, has led to a preponderance of generative form-making in architecture design studios and in select professional firms. This increasing use of parametric design in academia and practice is due in part to its capability of producing great variability within a set of constraints, creating variety that can be purposeful and responsive within a specified design space.

By itself, and to an even greater degree when combined with algorithmic procedures, parametrics changes the design process by requiring a strong initial focus on developing a workflow that will allow the designer flexibility downstream. The architect adds scripting environments, coding, and a true knowledge of the parameters controlling the geometry to the toolbox. Ultimately, this process increases the designer’s awareness of the implications of each design decision.
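
As a minimal sketch of such a workflow (the parameters, taper rule, and floor-area brief below are invented for illustration, not drawn from any particular practice), a few lines of Python can generate a family of tower variants and filter them against a constraint:

    import itertools
    import math

    def total_floor_area(base_r, taper, levels=20):
        """Sum circular floor-plate areas for a tower whose radius tapers with height."""
        area = 0.0
        for level in range(levels):
            t = level / (levels - 1)          # 0.0 at the base, 1.0 at the top
            r = base_r * (1.0 - taper * t)    # plate radius shrinks with height
            area += math.pi * r * r
        return area

    # Explore a small design space; keep only variants that meet the area brief.
    candidates = itertools.product([12.0, 15.0, 18.0], [0.2, 0.4, 0.6])
    viable = [(r, tp) for r, tp in candidates
              if 15000 <= total_floor_area(r, tp) <= 20000]
    print(viable)  # the (base radius, taper) pairs that satisfy the brief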

Yet the parametric process as currently driven seems to be more concerned with generating many forms that can then be evaluated for “aesthetic fitness” by their creator than with determining and applying performance metrics that could be used to evaluate the derived designs. Opportunities to include parameters and design intelligence, especially in conceptual design, that respond to climatic considerations, structural limitations, construction realities, future energy and water scarcity, information modeling, ecosystem balancing, and even community group advocacy are often overlooked. If parametricism is indeed the latest global “style” within architecture and urbanism (Schumacher, 2009), then we must ensure critical analysis and open dialogue on methods and processes.

Imagine parametric software fed by ubiquitous cloud computing providing immense processing power, informed by environmental, occupant, and even crowdsourced data, and pushed by collaborative teams of architects, engineers, and owners led by evidence-influenced designers. Envision a design process that never ends, in which information accumulates and can be accessed by occupants, owners, and the designers of retrofits. The resultant building is an intersection of the real and the virtual: formed by conceptual parametric design, building information modeling, and performance simulations, and then incorporating smart features, sensors, and cognizance of the environment and its own performance metrics to provide not a machine à habiter but an intelligent skin for people and their activities.

This panel will critically examine “infinite computing power,” parametric modeling, and a form of environmental entanglement where simulation and sensors inform and predict performances both before the building is constructed and then throughout its lifetime. The profession and the academies must come together to help determine the possibilities for the next generation of building intelligence and integrated process technology.
 
 
100_11: FORMAL PLASTICITY AND DETAIL INTRICACY
Digital Details
       Matt Burgermaster, New Jersey Institute of Technology
 
“In this new architectural domain, joints just don’t matter.” This was one of many provocative claims made by William J. Mitchell in his influential essay ‘Antitectonics: The Poetics of Virtuality’. Recognizing an extraordinary paradigm shift underway at the close of the 20th century, he characterized the emergence of digital technology as having the aptitude to so thoroughly detach value from the physical world that architecture’s primacy should no longer be ascribed to the static tectonics of building construction but to the dynamic flows of virtual information that pass through it. Eschewing the discipline’s traditional referents of material, gravity, and environment, the essay located architecture as a material thing and proposition on the losing side of history. Mitchell argued that digital technology offered more than just a new ‘toolbox’: a performative capacity that could enable the conceptualization of a new architecture of immateriality, weightlessness, and seamlessness. This radically reconfigured what was at stake for architecture’s details. With the emergence of this new architecture of digitally driven dematerialization, the detail disappeared – not only from the surfaces and forms of the architectural object, but from the discipline itself.

Today, however, the architectural detail is experiencing a renewed agency in digital practice. The development of information modeling, parametric design, and CNC fabrication has opened up new relationships between the previously divergent terms of virtual and material, in ways that evidence alternative digital aptitudes for the detail. These new production capabilities have generated renewed interest in concerns such as building performance, assembly, and articulation. But if the details associated with these traditional disciplinary concerns are an effect of software advances alone, then it might be said that contemporary detailing is a practice without a (digital) discourse. Consequently, for this technological paradigm to have full effect on the detail, it may not only need to be made digitally, or appear digital, but it should be digital. In an echo of Mitchell (and Nicholas Negroponte before him), Antoine Picon recently suggested that “…the digital is first and foremost a mode of being, a human condition that will eventually permeate all aspects of life. Being digital is not primarily about using a computer in the design process, nor about making this use visually conspicuous”. In response to this predicament, the digital detail has developed two distinct, but perhaps enigmatic, modes for its disappearance: as inconspicuously ‘nowhere’ and conspicuously ‘everywhere’. As emerging digital technologies continue to become more sophisticated and ubiquitous, the detail’s future may be a “digitally minimal” one.

This session topic invites papers to critically assess and/or projectively speculate on the architectural detail in an age of digital ubiquity. Positing that digital aptitude and the detail’s contemporary disappearance might have something in common, this session asks:

- What kind of agency does the detail acquire (or lose) by being digital?

- What kind of agency does the digital acquire (or lose) by building the detail?

- What is at stake for the detail in an architectural future whose aptitude is to be digital?
 
 
100_12: AESTHETIC INNOVATION
Digital Nouveau and the New Materiality
      Armando Montilla, Clemson U.
 
In 1883, Arthur Mackmurdo published what would come to be considered the first illustration in the genre of Art Nouveau. Deeply influenced by the Arts and Crafts movement initiated by William Morris, Mackmurdo founded the Century Guild and, in 1884, the movement’s journal Hobby Horse. His conception of design, particularly his graphic printing and his title page for Wren’s City Churches, among the incunabula of Art Nouveau, anticipated the style of the movement.

Today’s digital fabrication techniques – driven by software development and its capacity to generate complex geometries, by the numeric processing of the computer (the analogue mind behind the designer’s mind), by the simultaneity of information, and by the capacities for continuous reproduction, compressed timeframes, and rapid simulation – have allowed the advent of a new interface based on binary language and vast arrays of calculation. Such an interface fits the spirit, the ethos, of contemporary society, where information and data are instantly available and perfectly simultaneous. In the midst of David Harvey’s post-modernity, flirting with Paul Virilio’s worship of speed, this interface adopts, successively, the program and administration of command, datum, and parameter. Here intuition is replaced by exact calculation; the three-dimensional model becomes the aesthetic instrument of assessment, the eye remains the means of measurement, and nature endures as the absolute source of inspiration.

In Art Nouveau, the intersection of construction techniques and inspiration drawn from nature created the intellectual nest of a unique sense of craftsmanship, one that rebelled against the Beaux-Arts. Deep changes in society reflected these tendencies and allowed for revolutionary techniques: a new order, a new way of looking at the objet d’art. If we translate that aesthetic vision to today’s realities, we find the interface of aesthetic creation, the intuition of the designer to test and probe new and experimental methods, and the desire to represent and test fabrication through various techniques of representation and prototyping: to build before building becomes a step toward the built project. Just as in Art Nouveau, the interface is located in the mind of the designer, but it has been prosthetically extended, and the holistic view of architectural design now embraces a range from architecture to product design, to cities, to structural systems; and so the ‘total’ work of art, the Gesamtkunstwerk, is achieved in a perfect visual fetish for the screen.

What are the historical precedents that can be read in today's digital design techniques? Are concepts such as Riegl’s theory of Kunstwollen, or even Worringer’s empathy or Einfühlung, not only influential for Art Nouveau and Jugendstil, but also sources for today’s digital frontiers? As art and technology, science and aesthetics all come together in this festive Gesamtkunstwerk of design, how does this evolve into a new type of materiality? What historical learning remains most pertinent to informing emergent computational aptitudes? The purpose of this session is to explore these precedents, and to establish the relationship of history to present digital aptitudes.
 
 
100_13: EMERGING ECOLOGICAL MATERIALS AND ENVIRONMENTS
Emerging Materials, Renewable Energy, and Ecological Design
      Franca Trubiano, U. of Pennsylvania
 
Renewed interest in both emerging materials and renewable energy has greatly contributed to an enrichment of ecological design principles during the past decade. Since the 1996 publication of Van der Ryn and Cowan’s Ecological Design, product designers, architects, landscape architects, and urban designers have increasingly positioned ecological accountability at the center of their innovative designs. Building materials and the energy consumed in their production are both factors that significantly contribute to the selection of a project’s palette. Reviewing the embodied energy of materials used in a project has become an essential part of the design process. This, however, is only one way in which matter and energy are codetermined and codependent in the work of architecture.

Most recently, new and experimental materials have been used alongside energy-generating technologies in designs that seek to maximize their mutual benefit. Advances in material science have resulted in a host of new nano-engineered products made of polymers, composites, carbon fibers, and even digital materials, all of which possess an entirely different relationship to energy than the traditional materials of the 20th century. Reciprocally, industry advances in the channeling of renewable energy via thermal mass, translucent insulation, solar cells (thin-film or crystalline), and light-emitting surfaces of all kinds have redefined our expectations of how the energy embodied in daylight can contribute to an expanded definition of material performance.

This conference session invites the submission of papers dedicated to exploring the productive dialogue that exists between matter and energy, materials and sunlight. As advances in material science and renewable energy continue to transform the larger world within which we build, it is the aim of this session to highlight corresponding transformations within the immediate context of architectural design. Whether evidenced in innovative studio projects, experimental prototypes, conceptual designs, built projects, or advances in building products, a rigorous discussion of the various ways in which the coupled use of “smart” materials and renewable energy contributes to an expanded definition of ecological design is the intent of this session.

 
100_14: DIGITAL NETWORKS: COLLABORATIVE PRAXIS
Integration, Not Segregation: Interdisciplinary Design Pedagogy for the Second 100 Years
      James Doerfler, California Polytechnic State U.
      Kevin Dong, California Polytechnic State U.
 
Over 100 years ago the Deutscher Werkbund integrated architects, engineers, designers, and industrialists into teams in an attempt to upgrade the quality of product design in Germany. This movement put the designer in the position of mediator between invention and standardization. Walter Gropius and his colleagues adopted this position at the Bauhaus, creating the most influential interdisciplinary design school of the twentieth century. Over the course of the twentieth century, Eames, Fuller, Foster, KieranTimberlake, and others have promoted the inclusion of multiple disciplines on a design team. Recently we have seen the complexity of building increase to the point where legal requirements and sustainability guidelines alone call for integrated project teams. The increasing complexity of design and engineering required in projects likewise makes interdisciplinary teams of architects, engineers, material scientists, and construction managers necessary to bring these projects to fruition.

While a small number of practices worldwide have this professional collaborative experience, bringing this experience to the classroom is still a rare occurrence. In the twenty-first century, the integrated project team will become the typical method of delivering the buildings of the future. Can the interdisciplinary nature of the Bauhaus curriculum be re-tooled to prepare students to become architects in the twenty-first century?

The goal of this session is to identify approaches to interdisciplinary design education and reveal ways of making pedagogical changes that integrate multidisciplinary teams into a design curriculum. The session can explore the many issues inherent in bringing together different disciplines and discuss language and communication, digital information, the roles of members on the design team, and the relation of a multidisciplinary class to a single-discipline curriculum. Participants may elaborate on how building technology, design studio, history/theory, and other aspects of an architectural curriculum, together with engineering, construction management, and other related disciplines, can be integrated into an effective pedagogy. Presentation topics may include case studies from the design studio, critiques of design team strategies, the development of communication skills on interdisciplinary teams, and the use of digital tools.

Among the questions that enter this discussion are:
Can design inquiry be informed by a collective knowledge of many disciplines and create the best possible outcome?

How does an integrated design team change the nature of authorship of design?

Can the quality of the projects be changed by an integrated project team?

Can built-environment and materials research and innovation be increased by using integrated project teams?

Do interdisciplinary design teams “water down” the quality of design work?

Do digital tools increase the communication and effectiveness of interdisciplinary project teams?

Is it possible to create an effective interdisciplinary learning environment in an academic setting?
 
 
100_15: RELATIONAL MODELS AND RECALCULATIONS
Post-Parametric Environments
      Jennifer Leung, Yale U.
 
One of Marshall McLuhan’s central arguments, identified in “The Invisible Environment: The Future of an Erosion” as “counter-environment” (Perspecta, 1967), is that transformative new technologies eventually bring the social and sensory consequences of superseded technologies into relief “through the rear-view mirror.” For McLuhan, the Greek oral tradition is the counter-environment of written language; Romantic landscape, the counter-environment of the railroad and factory; technologies of classification, the counter-environment of cybernetic pattern recognition. Counter-environments change the very nature of perception, and by extension the opportunities for intervention. McLuhan’s dialectic is neither oppositional, nor mutually exclusive, but involves positioning. His counter-environment does not destroy the environment, it frames and creates awareness, suggesting new models of engagement with present, past, and future conditions.

This session will posit that the parametric is already a historical periodization and, as McLuhan might argue, is the counter-environment to the current state of pedagogy and practice. Thus, the post-parametric is not the intensification of the same or similar codes and processes, as in the rationalization of curvature, optimization of form, or even the potential democratization of compute power afforded by cloud computing, self-modeling buildings, or personal super-computing. Instead, fundamental assumptions about parametric thinking, design, and computation need to be re-assessed according to the physical realities of our actual environments and sensory thresholds. Questions of perception, scalability, technology transfer, translation (not from drawings to building, but from models to models), and the construction of evaluative criteria for iterative design are critical to this re-alignment.

For example, supplementing the familiar celebration (or castigation) of geometries freed from industrial production or of capital freed from the rules of arithmetic, recent conversations have begun to take up the digital division of labor and the management of economic and ecological risk, furthering the disciplinary understanding of socio-economic relationships of distribution, communication, and production. On the other hand, the ubiquity of data processing and the binary code has homogenized, rather than adjudicated, post-war technological debates about the status of matter, energy, biological life, time and subjectivity. In other words, the naturalization of the parametric environment has surpassed the thermodynamic, molecular, relativist, and psychodynamic points of view which historically have shifted cultural notions of space and time. However, research in these areas, now on the other side of a disciplinary divide, has not ceased.

This session seeks to bring together various positions on the post-parametric environment: positions that allow us to reflect generally on obsolescence, or that, via the reconsideration of obsolete diagnostic protocols, sensory thresholds, or representational schemata in practice or allied fields, clarify the near and distant future. What forms of technology will bring parametric thinking and computing into relief? How have perceptual and representational regimes kept pace, or not, with computing and fabrication? What is the status of “environment” after the parametric?
 
 
100_16: DIGITAL SYSTEMS IN URBAN ANALYSIS AND DESIGN
Registration and Projection: The Mediations of Urban Imaging Technologies
      McLain Clutter, U. of Michigan
 
Our discipline’s understanding of cities is intricately intertwined with our ability to document aspects of urbanism in measurable representations. History is replete with examples. Nolli’s plan of Rome concretized the conception of the city as a spatial construct of public occupation. Filmic representations of urbanism, as theorized by Benjamin, Kracauer and others, elucidated the kinetics and subjectivities of the early metropolis. More recently, GPS mobile devices have inspired thought about the networked nature of contemporary urbanism. With each of these developments, technological advances in representing urbanism opened epistemological regimes for studying and designing the city. Equally, through the mediation of each technological advance, new definitions of urban subjectivity and the public have been implicated.

The fast-paced development of contemporary digital technologies has accelerated this epistemological expansion. From GIS mapping applications, to parametric urban simulations, to remote-sensing photography, new technologies have broadened our ability to register previously invisible aspects of urbanism. Now apparent are latent patterns of human occupation, modulations in urban energy flow and consumption, and various time-based urban phenomena. While the broad adoption of these technologies promises to deliver an unprecedented amount of data from which to leverage design strategies, it also threatens design with a new positivism or technological determinism. Too seldom are the epistemologies of these technologies interrogated: What information do they allow, what do they exclude, and what are the inherent potentials or ideologies of these operations? Too seldom, also, are new imaging or simulation technologies examined in terms of the urban subjectivities or publics they implicate. This session solicits speculations, and papers about contemporary or historical work, that discuss the epistemologies of technologies for representing urbanism. Submissions should evince neither blind critique nor blind fervor for technological determinism. Rather, each paper should discuss how the epistemological potentials of urban imaging and simulation technologies have been, or might be, knowingly wielded to produce new ways of conceiving of urbanism and new ways of producing an urban public.

 
100_17: DIGITAL NETWORKS: EMBEDDED UBIQUITOUS COMPUTING
Situated Technologies
      Jordan Geiger, U. at Buffalo, SUNY
      Omar Khan, U. at Buffalo, SUNY
      Mark Shepard, U. at Buffalo, SUNY
 
Since the late 1980s, computer scientists and engineers have been researching ways of embedding computational intelligence into the built environment. Looking beyond the model of personal computing, which placed the computer in the foreground of our attention, "ubiquitous" computing takes into account the social dimension of human environments and allows computers themselves to vanish into the background. No longer solely virtual, human interaction with computers becomes socially integrated and spatially contingent as everyday objects and spaces are linked through networked computing.
 
Today, researchers focus on how situational parameters inform the design of a wide range of mobile, wearable, networked, distributed and context-aware devices. Incorporating an awareness of cultural context, accrued social meanings, and the temporality of spatial experience, situated technologies privilege the local, context-specific and spatially contingent dimension of their use. Despite the obvious implications for the built environment, architects have been largely absent from this discussion, and technologists have been limited to developing technologies that take existing architectural topographies as a given context to be augmented. At the same time, to the extent that early adopters of these technologies have focused on commercial, military and law enforcement applications, we can expect to see new forms of consumption, warfare and control emerge.
 
This panel seeks to occupy the architectural imaginary of these emerging technologies – sensors, embedded and mobile computing – and propose alternate trajectories for their development. What opportunities and dilemmas does a world of networked "things" pose for architecture and urbanism? What distinguishes the emerging urban sociality enabled by mobile technologies and wireless networks? What post-optimal design strategies and tactics might we propose for an age of responsive environments, smart materials, embodied interactions, and participatory networks? How might this evolving relation between people and "things" alter the way we occupy, navigate, and inhabit the city? What is the status of the material object in a world privileging networked relations between "things"? How do distinctions between space and place change within these networked media ecologies? How do the social uses of these technologies destabilize rationalized "use-case scenarios" designed around the generic consumer?
 
Finally, this panel seeks to focus these questions around specific contexts that may only be emerging now and are deserving of new attention. These may be geographical contexts such as non-Western milieus, contexts of identity including race- and gender-specific areas of consideration, or areas of the built environment, such as agriculture, that have been less frequently addressed to date.
 
 
100_18: DIGITAL MNEMONICS: THE PEDAGOGY OF HISTORY
Teaching History in the Digital Age
      Carla Keyvanian, Auburn U.
 
Technological developments are transforming the systems of architectural production and the traditional relationships among clients, designers, and builders. These changes are profound: no longer a matter of using computers to produce drawings previously prepared by hand, as happened with the introduction of AutoCAD. Building Information Modeling (BIM) software now produces dynamic digital building models that are queried to provide information ranging from building geometries to spatial relationships, geographic data, and energy performance.

What is underway is a shift in the way we think of the design process and its product, not simply in the way we represent our designs. The severance from our classical past appears more evident than ever. In such a context, what is the role of architectural history in educating future architects? The students currently enrolled in schools of architecture will be called upon to navigate epochal changes in architectural production. Is it still useful for them to spend precious time studying the proportions of Greek temples or the ornamentation of Renaissance palaces?

The underlying assumption of this session is that the teaching of history in schools of architecture is more necessary than ever because learning history means understanding change, not remaining fixated on outdated models. Understanding the technical, cultural and social changes that accompanied the Industrial Revolution, for example, helps put current events into perspective. A comparison with the Renaissance—when perspectival constructions enabling life-like spatial representations did not simply change the way architects drew architectural space but the way they conceptualized and designed it—is equally pertinent.

Much of what students learn today in studio courses might be obsolete by the time they are licensed architects. The critical thinking that history and theory courses foster is less perishable. Some of the central questions raised by the theme of this Annual Meeting are most aptly addressed in history and theory courses—the need to challenge naïve assumptions about the consequences of Western technological positivism, for example, or how technological advances might destabilize established spatial or social definitions.

The suggestion that such theoretical concerns need to be addressed from the start, in history courses aimed at beginning students, is implicit. This session seeks papers that either challenge that assumption or explore ways of teaching history courses that foster the critical thinking and analytical tools necessary to the young architects who will steer the course of current transformations. How can history and theory courses help students understand the new roles they will have to assume in the digital paradigm? What is the relationship between that paradigm and the opposite trend that has emerged recently, focused on community-based architecture with low technological content and minimal impact, but not eschewing formal concerns? In sum, this session intends to explore the range of possible answers to an overarching question of this Annual Meeting: What historical learning remains most pertinent to informing emergent computational aptitudes?
 
 
100_19: DIGITAL SIMULATION
Theoretical Implications of BIM: Performance and Interpretation
      John Folan, Carnegie Mellon U.
      Ute Poerschke, Pennsylvania State U.
 
In a simulation-driven design context the significance of information is often assumed to be absolute; information is equated with performance. But the significance of information in a performative context is interpretive. The locus of the information’s value is assigned by those utilizing it, and there are different forms of simulation that garner categorically disparate forms of information, in combination and in relative isolation. Some forms are technical, some reductive, and others contextual and projective.

At a moment when simulation-driven design is being absorbed into architecture curricula on a broader scale, it is appropriate for the academy and the profession to reflect on the theoretical implications of the technology. In design studios and seminars, students are utilizing Building Information Modeling (BIM) software to explore its underlying concepts and potential benefits through collaborative design and integrated project delivery. In cross-disciplinary advanced construction environments, facility management and life-cycle processes are being explored. In those contexts, BIM is primarily utilized as an analytical tool to increase efficiency in the design, production, construction, and occupancy phases. In design, fundamental simulations can provide specific predictive information about solar access, total solar radiation, thermal efficiency, and fluid-dynamic behaviors. In production, clash detection, automated schedule generation, and 4D construction modeling represent only a few of the predictive technical interpretations that may be employed through simulation. In both realms there is a dedication to the technical interpretation of information – an interpretation that runs the risk of being non-reflective and may only affirm assertions associated with functionalist thinking.

By contrast, there are circumstantial contingencies that suggest several models of theory that may be invested in the methodological employment of BIM. In general terms, information modeling provides an environment in which information can be attached to three-dimensional representations as building element information. Conceptually, by ordering these elements as process information, there is no limitation to the kind of information that can be stored. Beyond the generation of prosaic cost and element scheduling, the potential exists that more speculative forms of information might be cataloged in recombinant form: the visceral qualities of paintings in particular spatial contexts combined with demographic information in a neighborhood model, or individual formal ordering preferences in a residence combined with efficiencies in the assignment of recycled-content building materials to the enclosure system. In this speculative register, BIM can be considered beyond its instrumentality and can evolve as a mechanism to mirror the culture in which we live.
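
The idea that there is no limitation to the kind of information stored can be sketched with a simple data structure (a hypothetical illustration in Python; this is not the schema of any actual BIM platform):

    from dataclasses import dataclass, field
    from typing import Any, Dict

    @dataclass
    class BuildingElement:
        """A modeled element carrying geometry plus open-ended information."""
        name: str
        geometry: Dict[str, Any]                             # e.g. a wall's dimensions
        info: Dict[str, Any] = field(default_factory=dict)   # no fixed schema

    wall = BuildingElement("north wall",
                           {"type": "wall", "length_m": 6.0, "height_m": 3.0})
    # Prosaic and speculative data live side by side on the same element.
    wall.info["recycled_content_pct"] = 35
    wall.info["neighborhood_median_age"] = 31
    wall.info["spatial_quality_note"] = "gallery-like light in late afternoon"
    print(wall.info)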

This session seeks to solicit discussion focused on proposals for potential theoretical orientations in Building Information Modeling. The mechanisms for engendering positive and relevant sensibilities in the academic setting should be considered fundamental. Papers focusing on instrumental reason in the context of a situated understanding of place, population, representation, and purpose are requested.