(Originally written for MS degree at University of Michigan)
Digital technology has been steadily studied and experimented with in the field of architecture, yet its potential has not been fully exploited. In particular, designers' knowledge of computation and algorithms, which traces back to mathematical roots, has matured to the point of industrial deployment; still, the business of design practice has many vacancies where expertise in digital technology could tremendously enhance productivity, which is largely proportional to profit. The profession needs to engage in widespread computational design in the face of "wicked problems," a term Roger Martin takes up in The Design of Business[1]. Digital technology is the special task force architecture can field against those difficult problems.
Here digital technology may be used interchangeably with design computation, algorithm-aided design, or many other nuanced terms. The fundamental idea remains that architects need to be better versed in a multitude of software, know the rudiments of computer programming, and design more through parameters. Architects equipped with these tools also bring a new business strategy to the practice.
First of all, computational thinking in architectural design is a long-considered concept. Designers may tend to regard it as a recent side effect of highly evolved computers and information technology, but it has in fact developed alongside the progression of hardware. Because architecture is an expression of forms, and forms tie largely to geometries, scholars have already extrapolated much from earlier theories of geometry and, moreover, of mathematics. Regarding computation, in a nutshell, the philosophical implications come down to what Jane Burry of RMIT University calls a "tussle between intuition and logic"[2]. Whether it is inhuman of us to pursue the rigid logic behind digital technology, which is largely deterministic in nature, is perhaps a debate beyond buildings.
In their introduction to Computational Design Thinking, Sean Ahlquist and Achim Menges quote writings on design from the 1960s, arguing that in architecture a problem does not consist of sub-problems we can tackle one by one. The interactions between the parts make the whole larger than simply their sum. In short, architecture is a system, not a static object in isolation. The computer's ability to provide stability through "cyclic causality," or feedback loops, offers us a way to understand how such systems operate[2].
The two authors go on to relate parametric design to form morphology, a term introduced as early as 1796. In this sense computation and algorithms concern style more than productivity. Style is nevertheless commercially viable, as practitioners such as Patrik Schumacher and Frank Gehry, and schools such as SCI-Arc and the University of Pennsylvania, have shown. However, a style can rarely be widely adopted without losing its unique value. Architects should first consider design computation as a modal shift in business rather than a stylistic innovation.
In practice we can look to the Specialist Modeling Group at Foster + Partners. One example product of this group is the Swiss Re headquarters in London. Completed a few years after the turn of the century, its design phase predates the first release of Revit, which itself needed a good many more years to become a common staple. The form-finding process for the curved tapering was defined by a few geometric parameters, through which variations of the tower's shape could be visualized and explored rapidly.
One instance of a wicked problem emerged when the City Corporation, the body that establishes guidelines and maintains regulations for building projects in London, returned the scheme for another round of design after a full-fledged package had been submitted for approval, simply because it believed that a slightly more slender profile would benefit the image of such a building[3].
In school, iterations are good for honing design skills. In practice, iterations are good for closing in on optimal solutions, but they are costly exercises. If Foster + Partners had not pioneered parametric modeling, where a change in a numeric value translates into a taller building and the floor plates, defined as a function of the exterior profile, re-adapt themselves to the new shape, the design team would have had to draft from a blank sheet of paper all over again just to add three more floors.
It should be noted that sometimes adding floors means no more than duplicating existing drawings. But with the iconic diagrid, a conceptual monocoque building up to a fuselage-like shape, each component of Swiss Re's headquarters is susceptible to a tweak of the whole system.
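As a minimal sketch of the idea, consider the following, which is by no means Foster + Partners' actual model: the exterior profile is a single curve parameterized over height, every floor plate reads its radius off that curve, and a taller tower is a one-number change. The profile curve and all dimensions are assumed for illustration.

```python
import math

# Hypothetical parametric tower: the profile bulges near mid-height and
# tapers toward the top, loosely in the spirit of Swiss Re's fuselage shape.
def profile_radius(z, total_height, base_r=25.0, belly_r=28.0, top_r=13.0):
    """Floor-plate radius at elevation z, read off one quadratic profile curve."""
    t = z / total_height
    return (1 - t) ** 2 * base_r + 2 * (1 - t) * t * belly_r + t ** 2 * top_r

def floor_plates(floors, floor_height=4.1):
    """Every plate derives from the profile; change `floors` and all re-adapt."""
    total_height = floors * floor_height
    return [(level, round(profile_radius(level * floor_height, total_height), 2))
            for level in range(floors)]

# Three more floors is a one-number change, not a redraft:
for level, radius in floor_plates(41)[-4:]:
    print(level, radius)
```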
The specialist group at Foster + Partners was established in 1998. At the time no architectural software readily facilitated the kind of algorithmically aided design the firm wanted to do, so a small internal team of architects with interests in technology and scripting had to alter existing CAD programs and, as Xavier De Kestelier puts it in his Architectural Design article, "misuse" bits and pieces of their functionality[4]. The timeline of the Swiss Re project coincides with the birth of the specialist group; the project could well have been an impetus for the entire digital initiative. Today, however, architectural practices are still plagued by the indecisions (albeit often well-reasoned) of their clients, city planning committees, consultants, contractors, and occasionally activist groups. The moment design firms embrace digital technology, architects will answer fewer demands and articulate better ideas.
A traditional architectural firm often relies on seniority in decision-making: what has been done before will be done again, because it will work if it has worked. That hunch born of years of practice can now very often be quantified with computation and algorithms. An optimized solution uses numbers and statistics to show that it will work because as many needs as possible are met. In Roger Martin's terms, an architecture firm's bank of knowledge from previous projects is a collection of heuristics. Heuristics depend on the people who possess them; once the people leave, the firm loses them. Design computation pushes that knowledge down to algorithms, the bottom of Martin's knowledge funnel. An office that can produce based on algorithms operates far more profitably than one working through trial and error, the primitive top of the funnel.
Eighteen years after Foster + Partners created a technology-oriented specialist team, many architectural giants have followed suit and implemented the same expertise in their practices.
Kohn Pedersen Fox Associates (KPF) has set up a research initiative in its office called KPF urban interface, or "KPFui" for short. Given the high volume of "supertalls" in the firm's design work, one of KPFui's tasks is to investigate the impact of high-rise buildings on their urban environment, quantify it, and push back on the formal design of the building.
In a lecture given at the MAS Summit for New York City in 2015, KPFui's director Luc Wilson presented three analyses that used a methodology rooted in design computation[5]. His co-speaker Jesse Keenan likened the methodology to the doctor's MRI, a technology that did not supplant medical expertise but informed faster decisions. I elaborate on two of the analyses here to demonstrate KPFui's value to its parent office.
The first analysis was on direct sunlight. Some building codes regulate buildings by the one-hour continuous shadow they generate. Luc and Jesse showed a diagram in which KPF's proposed building was seen to comply with that regulation. Although the speakers never specified the tools and expertise involved in generating and using the diagram, as these might be trade secrets, my experience lets me surmise that they parameterized the height or girth of the building mass, ran it through a ray-tracing program or script not dissimilar to those of popular rendering engines, and tailored the parameters until the shadows fit. This is known as solar carving, a common massing study for high-rise architecture in many offices. With computation, the task can be assigned to one person and completed within a matter of hours.
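To make that surmise concrete, here is a minimal sketch, emphatically not KPFui's undisclosed tool: the building is reduced to an axis-aligned box, the sun path to hourly (altitude, azimuth) samples, and the height is carved down until no protected ground point sits in continuous shadow for more than an hour. Every number in it is an illustrative assumption.

```python
import math

def shaded(target, height, footprint, alt_deg, az_deg):
    """True if ground point `target` (x, y) is shaded by a box of (w, d) footprint."""
    if alt_deg <= 0:
        return False  # sun below the horizon casts no direct shadow
    # Horizontal drift per unit of rise along the ray from target toward the sun
    # (azimuth measured clockwise from north).
    sx = math.sin(math.radians(az_deg)) / math.tan(math.radians(alt_deg))
    sy = math.cos(math.radians(az_deg)) / math.tan(math.radians(alt_deg))
    w, d = footprint
    lo, hi = 0.0, height  # z-interval over which the sun ray may hit the box
    for p, s, limit in ((target[0], sx, w), (target[1], sy, d)):
        if abs(s) < 1e-12:
            if not 0.0 <= p <= limit:
                return False  # ray parallel to this slab and outside it
        else:
            z0, z1 = -p / s, (limit - p) / s  # solve 0 <= p + z*s <= limit
            lo, hi = max(lo, min(z0, z1)), min(hi, max(z0, z1))
    return lo <= hi

def max_continuous_shade(target, height, footprint, sun_path):
    """Longest run of consecutive shaded samples (hours, for hourly samples)."""
    run = best = 0
    for alt, az in sun_path:
        run = run + 1 if shaded(target, height, footprint, alt, az) else 0
        best = max(best, run)
    return best

def carve_height(target, footprint, sun_path, h_max=300.0, step=5.0):
    """Tallest height whose continuous shadow on `target` stays within one hour."""
    h = h_max
    while h > 0 and max_continuous_shade(target, h, footprint, sun_path) > 1:
        h -= step
    return h

# Assumed low winter sun around noon, sampled hourly; a protected point 60 m north:
noon = [(21, 170), (22, 180), (21, 190)]
print(carve_height(target=(15.0, 60.0), footprint=(30.0, 30.0), sun_path=noon))
```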
Next was an analysis of ambient light, quantified as the amount of sky visible from a given vantage point behind a window. A new building blocks visible sky from its neighbors, but the blockage depends on the view angle: the closer one stands to the window, the larger the angle at which one can look up, and the more sky one sees. Here the formal decisions at stake are truly driven by a set of numeric constraints; what better tool is there than design computation? In KPFui's study, the findings determined the tapering of a proposed supertall so that it allowed the same amount of visible sky as the existing low-rise on the site.
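The geometry can be illustrated in a two-dimensional section. The sketch below is my reading of the metric, not KPFui's published method; the window, street, and obstruction dimensions are all assumed.

```python
import math

def visible_sky_deg(setback, window_head, street_width, obstruction_height,
                    eye_height=1.2):
    """Angular wedge of open sky seen from `setback` meters inside the window."""
    # The upper sightline grazes the window head from the vantage point...
    up = math.degrees(math.atan2(window_head - eye_height, setback))
    # ...and the lower sightline grazes the top of the building across the street.
    low = math.degrees(math.atan2(obstruction_height - eye_height,
                                  setback + street_width))
    return max(0.0, up - low)

# Stepping closer to the window widens the wedge of visible sky:
for setback in (2.0, 1.0, 0.5):
    print(setback, round(visible_sky_deg(setback, 2.4, 20.0, 15.0), 1))
```

In a study like KPFui's, the obstruction height would itself become the tapering parameter, tuned until the wedge matches that afforded by the existing low-rise.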
Single buildings impact the greater environment; collectively, buildings comprise the city fabric, and the highly urbanized communities where we live bring forth new urban challenges. KPFui has shown that through smarter tools architects can design better buildings and therefore better urban environments. Another of its studies serves as the case in point: using an evolutionary solver embedded in a popular scripting tool, KPFui successfully managed the intricate correlations among nearly a dozen urban factors across a group of buildings, such as FAR, city-block constraints, and sky exposure.
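A toy version of such a solver fits in a few dozen lines. The genome, weights, and targets below are illustrative assumptions rather than the study's actual model, but the loop of scoring, selecting, and mutating massing parameters is the same mechanism.

```python
import random

random.seed(42)  # reproducible toy run

TARGET_FAR = 12.0   # assumed density goal
SITE_AREA = 4000.0  # m^2, hypothetical block
MAX_HEIGHT = 250.0  # m, hypothetical zoning cap

def score(genome):
    """Reward hitting the FAR target while keeping an open-sky proxy high."""
    width, depth, floors = genome
    if floors * 4.0 > MAX_HEIGHT or width > 60 or depth > 60:
        return -1e9  # infeasible under the (assumed) block constraints
    floor_area = width * depth
    far = floor_area * floors / SITE_AREA
    sky_exposure = 1.0 - floor_area / SITE_AREA  # crude sky-exposure proxy
    return -abs(far - TARGET_FAR) + 2.0 * sky_exposure

def mutate(genome):
    w, d, f = genome
    return (max(10.0, w + random.uniform(-3, 3)),
            max(10.0, d + random.uniform(-3, 3)),
            max(1, f + random.randint(-2, 2)))

# Evolve: keep the ten fittest massings, refill the population with mutants.
pop = [(random.uniform(10, 60), random.uniform(10, 60), random.randint(5, 60))
       for _ in range(50)]
for _ in range(200):
    pop.sort(key=score, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(40)]

w, d, f = max(pop, key=score)
print("w=%.1f d=%.1f floors=%d FAR=%.2f" % (w, d, f, w * d * f / SITE_AREA))
```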
Smaller practices may also have informal teams of high digital-technology caliber. I previously worked at Pelli Clarke Pelli Architects (PCPA), an office 130 designers strong, a rather moderate size relative to the geographic scope of its work. There, a core team of three designers often employs scripting tools and parametric design principles to aid their processes, despite their usual role of carrying projects from pre-schematics through construction administration.
For example, this team of three was working on a large mixed-use commercial project with two point towers supported by a retail podium when I joined briefly to expedite architectural production. The primary tower was approximately 70 storeys tall, and its curtain wall panels numbered in the thousands. With a gently tapering building form, drafting any representation of these panels would have meant an enormous number of man-hours. Instead, the team adopted a tessellation tool in a 3D package that handles geometry accurately, exported the geometric information into a data spreadsheet, and translated it into a Building Information Model (BIM). The only time the designer (yes, a sole designer) had to invest was in prescribing the rules of panel division. Afterwards, the designer pressed a button to deploy, and thousands of virtual curtain wall panels practically "climbed" up the digital model space on their own. With the BIM's capacity to output drawings, designing as well as representing this myriad system took a mere two-day turnaround.
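The pivotal middle step, rules of division out to a spreadsheet, can be suggested in a few lines. The sketch below is a hedged reconstruction with an assumed linear taper and assumed dimensions, not the project's actual tool; it emits one row per panel of the kind a BIM importer could consume.

```python
import csv
import math

FLOORS = 70
FLOOR_HEIGHT = 4.0     # m, assumed
BASE_RADIUS = 24.0     # m, assumed plan radius at grade
TOP_RADIUS = 18.0      # m, assumed plan radius at the crown
PANELS_PER_FLOOR = 48

def radius_at(level):
    """Linear taper from base to top; the real tower's curve would differ."""
    t = level / FLOORS
    return BASE_RADIUS + (TOP_RADIUS - BASE_RADIUS) * t

with open("panels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["panel_id", "level", "angle_deg", "width_m", "height_m"])
    for level in range(FLOORS):
        r = radius_at(level)
        width = 2.0 * math.pi * r / PANELS_PER_FLOOR  # chord taken as arc length
        for i in range(PANELS_PER_FLOOR):
            angle = 360.0 * i / PANELS_PER_FLOOR
            writer.writerow([f"L{level:02d}-P{i:02d}", level,
                             round(angle, 2), round(width, 3), FLOOR_HEIGHT])
```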
The industry has surely moved away from pencil and paper, but alongside this group at PCPA, other designers were still working out their math by hand, clicking in each line in drafting software, and limiting themselves to rudimentary forms of regularity and repetition. Of course, this is no plea for intentional complexity; simple forms are often the most powerful and effective solutions in architecture. But in circumstances like the one described above, knowing how to rationalize a scheme through parameters and automation benefited both the design and the workflow. I suspect that once PCPA formalizes these digital talents into an operation like that of Foster + Partners, the efficiency will no longer be merely project-specific.
The cost of running such a digital group on a design team is rather low; little is added to a firm's existing infrastructure. Smaller prototyping work can be done by in-house model-making staff, and larger fabrication testing is usually a collaboration between professional fabricators and architects. A digitally savvy group by its nature requires only computers. Considering the speed it brings to architectural production and the resources it frees for investment in schematic depth, every architecture firm should have a group like this.
Furthermore, as a point of departure, these in-house digital groups can even operate as independent businesses, hired by other architects or by other professions seeking consultation. NBBJ boasts a "proprietary computational tool" that helps determine the "…right mix of services and spaces for a healthcare institution." The analytics a specialist team examines will sometimes yield non-formal results that are nevertheless valuable to those outside architecture. Digital technology has propelled more than a few industries in the past few decades, and architects should not be slow to the game.
[1] Roger Martin, The Design of Business (Boston: Harvard Business Press, 2009)
[2] Achim Menges and Sean Ahlquist, Computational Design Thinking (West Sussex: John Wiley & Sons Ltd, 2011)
[3] Kenneth Powell, 30 St Mary Axe: A Tower for London (London: Merrell Publishers Ltd, 2006)
[4] Xavier De Kestelier, "Recent Development at Foster + Partners' Specialist Modeling Group", Architectural Design, Vol. 83, Issue 2, pp. 22-27
[5] Luc Wilson and Jesse Keenan, "The Science of Supertall" (presented at the Municipal Art Society Summit for the City of New York, 2015)