Professor Gregory Fenves could hardly have picked a better location to study earthquakes. His office on the University of California, Berkeley campus sits almost on the Hayward Fault, which runs just east of Davis Hall, where Fenves is the chairman of the Department of Civil and Environmental Engineering. From there, the Hayward slices right through Memorial Stadium where, each fall, crowds of 70,000 come to watch Cal football. Relative to each other, half the stadium is poised to lurch slightly towards Canada, while the other half aims for Mexico.
Fenves's research interest is in using computers to predict how structures of various designs will hold up in earthquakes with varying vector forces. Where some earthquake researchers employ shake tables, centrifuges, and reaction wall systems, Fenves and his graduate students stage virtual quakes on computers. "Physical and software simulation have developed somewhat in parallel," he says. Software simulation relies on algorithms and data models, as well as the kind of number crunching that has accelerated over the years with Moore's Law. Meanwhile, physical simulation "is important for understanding what happens in the real world under controlled conditions. We use the data to validate and calibrate our software models."
Both the physical test equipment and the software analysis are part of the research conducted by the Pacific Earthquake Engineering Research Center (PEER), a 10-year project funded principally by the National Science Foundation. PEER's website gives much detail about the actuators, controllers, and pounds of force used to re-create earthquake conditions, but the overarching message is this: when it comes to designing structures for earthquakes, simulation is much preferable to the real thing.
Software simulation involves the use of finite element analysis (FEA), which is also employed for virtual product stress testing, the kind used to analyze, say, aircraft designs. That is what Berkeley's Ray Clough was doing when he coined the phrase "finite element method" back in 1960. "He called it that because the technique turns a continuum into discrete elements that have simple mathematical properties, which can lead to an understanding of larger systems," Fenves says. Researchers have used FEA software to analyze structural designs for earthquakes since the early 1970s, "but the software was incredibly crude compared to what we are now doing." FEA programs now accommodate far larger models and deliver more precise results. But to push the technology even further, at least two essentials are needed: a coordinated software development effort with common tools, and a few smart civil engineering students who can also program. Both of these non-linear problems have occupied Fenves for the last 10 years or so.
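To make "discrete elements that have simple mathematical properties" concrete, consider the textbook one-dimensional case (a generic sketch, not one of PEER's models): a uniform bar divided into two axial elements. Each element contributes a small stiffness matrix, the contributions are assembled into a global matrix, and the analysis reduces to a linear solve:

```latex
k_e = \frac{EA}{L}\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix},
\qquad
K = \frac{EA}{L}\begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix},
\qquad
K\,u = f
```

Here E is the material modulus, A the cross-sectional area, L the element length, u the unknown nodal displacements, and f the applied nodal forces. Earthquake models swap in non-linear, time-dependent elements and many thousands of degrees of freedom, but the assemble-and-solve structure survives intact.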
"We want to build upon each person's research accomplishments without every person having a different version of the code," Fenves says. "And when someone develops a model or equation solver or a time integration scheme, we want a common software framework to ensure they all work together." He is of course describing the methods found in the open source development model. That may seem like old news, but in the structural engineering field, the idea is relatively new. And for good reason: the open source movement was created by full-time programmers who think a lot about software development, whereas engineers prefer to spend their time doing engineering. "In the Unix/
In the mid-1990s, Fenves's graduate students developed an object-oriented development framework called the Open System for Earthquake Engineering Simulation. The framework is written in C++, with the Unified Modeling Language used to define a set of extensible classes in the areas of modeling, analysis, and structural reliability. It is platform-neutral, running mostly on Windows but also on Linux and Mac OS X. It includes wrappers for Fortran, a language still in wide use among civil engineers. The framework's creators admit they have freely "swiped" ideas from the Open Source Initiative, reflecting that philosophy in the short version of the name: OpenSees.
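What an extensible class buys the researchers is the ability to plug in a new model without touching the analysis code. A minimal sketch of the pattern in C++ (the class and method names here are invented for illustration, not OpenSees's actual API):

```cpp
#include <cstdio>

// A material model maps strain to stress and tangent stiffness.
// Illustrative interface only; not the real OpenSees class hierarchy.
class Material {
public:
    virtual ~Material() = default;
    virtual double stress(double strain) const = 0;
    virtual double tangent(double strain) const = 0;
};

// A researcher contributes a new model by subclassing; solvers and
// integrators written against the abstract interface need no changes.
class LinearElastic : public Material {
public:
    explicit LinearElastic(double E) : E_(E) {}
    double stress(double strain) const override { return E_ * strain; }
    double tangent(double) const override { return E_; }
private:
    double E_;  // Young's modulus
};

int main() {
    LinearElastic steel(29000.0);              // e.g., ksi
    std::printf("%g\n", steel.stress(0.001));  // prints 29
    return 0;
}
```

The same contract-first idea extends to elements, solvers, and time-integration schemes, which is what lets independently developed pieces "all work together."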
The second challenge for Fenves has been finding civil engineering students who know enough about computing to take a few courses outside the department and learn about object-oriented design, APIs, and C++. To paraphrase the U.S. Marines, he is looking for a few good students.
"This problem extends through much of science," Fenves says. In the big computational science projects, the software people and the science people come from different planets, and they have to figure out how to work together. Occasionally, someone enters the field with both skills, which, after 25 years, describes Fenves himself. "I like to work at the boundaries between computing and structural engineering-to bring the two together."
Fenves says that C++ is especially daunting for students but remains the overall best option for software design. At Berkeley's civil engineering department, Matlab is the teaching language of choice. "It's a powerful package put out by The MathWorks: an interpreter set up for mathematical processing. The interpreter makes it a great teaching language, but it's too slow for large research programs." Fenves thinks the department should also be teaching Java because of its object-oriented structure. "It's a pretty clean language compared with C++, but others think it will be too confusing for students to learn two languages: Matlab and Java."
OpenSees also uses Tcl, a pre-Java scripting language created by John Ousterhout, a former Berkeley computer science professor. Tcl enables engineers to program their models without having to dive into the intricacies of C++. "Tcl executes all the OpenSees constructors. It's a first-class programming language, with objects, lists, data structures, and control structures," Fenves says. For OpenSees, Tcl has thus become the research group's interface of choice. Some students have also experimented with GUIs, but Fenves says that this is really the domain of the commercial vendors. "GUI technology changes very rapidly, and students don't have the skills or time to keep up." GUIs are also very OS-specific, making the code difficult to maintain. He hopes that, in the long term, the commercial world will "package" the research models and algorithms by adding a GUI, support, documentation, and quality assurance.
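The plumbing behind "Tcl executes all the OpenSees constructors" is the standard Tcl C API, in which a script-level command is registered against a C or C++ callback. A bare-bones sketch of the pattern (the command name and its behavior are hypothetical; this is not the actual OpenSees registration code):

```cpp
#include <tcl.h>   // Tcl C API; link with -ltcl
#include <cstdlib>

// Callback invoked whenever a script runs the hypothetical command
// "elasticMaterial <E>". In a real framework it would construct a
// C++ material object and register it with the model.
static int ElasticMaterialCmd(ClientData, Tcl_Interp* interp,
                              int argc, const char** argv) {
    if (argc != 2) {
        Tcl_SetResult(interp, (char*)"usage: elasticMaterial E", TCL_STATIC);
        return TCL_ERROR;
    }
    double E = std::atof(argv[1]);
    (void)E;  // ...construct the object and hand it to the model here...
    return TCL_OK;
}

int main() {
    Tcl_Interp* interp = Tcl_CreateInterp();
    // Bind the script-level name to the C++ constructor logic.
    Tcl_CreateCommand(interp, "elasticMaterial",
                      ElasticMaterialCmd, nullptr, nullptr);
    // A model script is then ordinary Tcl, loops and variables included.
    Tcl_Eval(interp, "for {set i 0} {$i < 3} {incr i} {elasticMaterial 29000}");
    Tcl_DeleteInterp(interp);
    return 0;
}
```

Because the model description is a real program rather than a static input file, an engineer can parameterize a design study with a loop instead of editing hundreds of lines by hand.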
Surprisingly, OpenSees lacks a conventional output file with pre-formatted rows and columns showing the analysis results. Fenves had expected to hear a lot of complaints about that (he didn't), but argues that output files don't scale well to bigger models. "Instead, we make use of recorder objects, which are usually just interfaces to a MySQL database. If you are shaking something and want to know the strain, you put in a recorder object and query the data. We are now working on another database repository that is part of a national effort, with recorders that output XML metadata."
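A recorder, as Fenves describes it, behaves like an observer attached to the analysis loop: at each step it samples a response quantity and forwards it to whatever back end is configured. A minimal sketch of that idea (the interface below is invented for illustration; the real OpenSees recorders and their MySQL and XML back ends are more elaborate):

```cpp
#include <cstdio>

// Illustrative recorder interface: the analysis calls record() each
// step; concrete subclasses decide where the sample goes (standard
// output here, an INSERT into a MySQL table or an XML stream in a
// database-backed recorder).
class Recorder {
public:
    virtual ~Recorder() = default;
    virtual void record(int step, double time, double strain) = 0;
};

class PrintRecorder : public Recorder {
public:
    void record(int step, double time, double strain) override {
        std::printf("%d %g %g\n", step, time, strain);
    }
};
```

Queries then run against the store after the fact, which is what lets the approach scale: nothing forces the entire response history of a huge model through one monolithic output file.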
OpenSees has now been around long enough for Fenves to assess its influence. "Twice a year, I do a 'Google metric' to see how many hits we get compared to the commercial packages. Interest has been growing."
Off-the-shelf vs. roll-your-own
Fenves says that the codes used for earthquake simulation have either come from the commercial sector or emerged from a haphazard development process within academia. Commercial applications like Ansys, Abaqus, Adina, and the structural engineering program SAP2000 have some advantages: they are tested, documented, and supported. "For design engineers with a job to do, these are good solutions, but they are not adequate for cutting-edge research because the modeling capability is not enough. Large models are at the core of our research, and the commercial packages aren't equipped to handle them. The same is true for the testing of new algorithms: commercial code, being closed source, can't be modified to include them."
Researcher-developed code is more accommodating of experimentation, but in the past it has run into a different sort of problem: a lack of coordinated development and no code repository. "Structurally, development has too often branched at the root, never to re-converge. With some widely used applications, there could be 200 versions out there, with some unable to interoperate with others. With the open source model, there's a whole process involved. People communicate about who is working on what, and someone takes responsibility for the repository." The OpenSees website links to some of the basics and culture of open source, creating a primer for interested students. There are links to Eric Raymond's seminal essay "The Cathedral and the Bazaar," the Apache project, GNU, Red Hat Linux, and the Open Source Initiative, the non-profit corporation behind the movement.
When PEER was established in 1997, OpenSees fit right into the project's overall goal of "performance-based" earthquake engineering. The term defines broader criteria for designing structures to withstand earthquakes. Today, earthquake design is prescriptive: you design structures to meet the building codes, and if they pass, you have done your job. "Building codes are there to save lives, but they don't say anything about whether the building can be used afterward, the damage it might sustain, or the cost of repair," Fenves says. "Nor do the codes say whether there will be loss of life due to non-structural components: ceilings falling, pipes breaking. Performance-based engineering looks into all of this. It means identifying performance goals, not just building codes, then designing a structure to meet those goals. The key to performance-based engineering is the ability to predict what will happen to a given structural design given a certain type of earthquake."
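A common way to summarize performance-based engineering quantitatively, often called the PEER framing equation, chains ground-motion hazard through structural response and damage to a decision variable such as repair cost:

```latex
\lambda(DV) \;=\; \iiint G(DV \mid DM)\;
  \mathrm{d}G(DM \mid EDP)\;
  \mathrm{d}G(EDP \mid IM)\;
  \mathrm{d}\lambda(IM)
```

Here λ(IM) is the annual rate of exceeding a ground-motion intensity measure, EDP an engineering demand parameter such as story drift, DM a damage measure, and DV a decision variable (repair cost, downtime, casualties). The middle link, predicting EDP from IM, is exactly the simulation problem OpenSees is built to attack.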
An inevitable byproduct of performance-based engineering is that it puts more "stress" on the simulation software. The creators of commercial packages don't worry about performance-based goals, "because an engineer is not going to use a model that is more sophisticated than is needed to satisfy the code. Right now, if a commercial firm actually added sophisticated models, I don't think anyone would use them. The design process needs to change first." That means not just crunching numbers with existing software, but writing entirely new code. "The fidelity of our models far exceeds what's available commercially, with the ability to replicate experimental data throughout a full range of loads. Whether we are looking at soil models, reinforced concrete models, or non-linear solution methods, our solvers are much more capable than what's available commercially."
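The "non-linear solution methods" in question are, at bottom, iterative root-finders applied to the equilibrium equations. Here is a sketch of the standard workhorse, Newton-Raphson iteration, reduced to a single degree of freedom (an illustration of the general technique, not PEER's code):

```cpp
#include <cmath>
#include <cstdio>

// Solve the equilibrium residual r(u) = pExt - pInt(u) = 0 by
// Newton-Raphson: linearize the resisting force about the current
// displacement and correct until the residual is negligible.
// Production FEA does the same with a global tangent stiffness matrix.
double solveEquilibrium(double pExt,
                        double (*pInt)(double),     // resisting force
                        double (*tangent)(double),  // d(pInt)/du
                        double u0, double tol, int maxIter) {
    double u = u0;
    for (int i = 0; i < maxIter; ++i) {
        double r = pExt - pInt(u);
        if (std::fabs(r) < tol) break;
        u += r / tangent(u);  // Newton correction
    }
    return u;
}

// Example: a softening spring whose stiffness degrades with amplitude.
static double pInt(double u)    { return 100.0 * u / (1.0 + std::fabs(u)); }
static double tangent(double u) { double d = 1.0 + std::fabs(u); return 100.0 / (d * d); }

int main() {
    double u = solveEquilibrium(40.0, pInt, tangent, 0.0, 1e-8, 50);
    std::printf("equilibrium displacement: %g\n", u);  // about 0.667
    return 0;
}
```

Researching new solution methods means changing this loop itself (line searches, arc-length control, better tangents), which is exactly what a closed-source package won't allow.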
When combined, these various software models can lead to surprising results. That was the case for the analysis of a bridge near the California-Oregon border, conducted by a research group at the University of California, San Diego. Fenves thinks it may be the most advanced model of a bridge ever created. When researchers ran the simulation, the results predicted that soil on the hill could slump and liquefy, squeezing the bridge together. "People have talked about that happening, but the model actually showed us that if the earthquake is big enough, with the right site conditions, the soil slides in and the bridge follows. You couldn't model this on a shake table." Nor could you do so in commercial software, because the building codes don't require it.
The shake table in Richmond
After leaving Professor Fenves's office, I drove a few miles west to the university's Richmond Field Station, home to the Pacific Earthquake Engineering Research Center itself. The site is near the Interstate 80 freeway, which runs alongside the muddy shoreline of the San Francisco Bay. Except for the nearby traffic, the place was quiet, unlike in 1860, when the California Blasting Cap Company did business here.
On the day I visited, a group of researchers wearing hard hats had gathered around PEER's shake table, one of the largest in the United States. Stephen Mahin, professor of structural engineering, explained that the group was testing seismic isolators, a kind of structural shock absorber. The technology is expensive and is more commonly used in Japan than in the United States. Above the table, the isolators were mounted on a simple structure that resembled the scaled-down skeletal base of a skyscraper. "Here, we are looking at putting isolators at the top of the first-floor columns," Mahin said. "This configuration would make them more palatable for construction in the U.S."
I asked Mahin about the relationship between this kind of physical simulation and its software counterpart. "We integrate the two," he said. "We can model some parts quite well analytically, often because we have lots of data. But in other cases, we are looking at new concepts, so only experimental testing will do. Many of the things we test here are scale-dependent: scaled-down structures don't necessarily deliver results we can believe. At the same time, even a detailed finite element analysis doesn't convince everybody that it is reflecting the real world. Physical simulation is an exercise in humility."
Sometimes, the two types of simulation are combined into a single hybrid analysis. "Say you have a big structure like a suspension bridge, and you want to analyze one of the bridge piers. So you use an actual pier, then model the rest of it in software: the tower, deck, cables, foundations, sea bed, and wave propagation." The FEA application operates as it always does, but every time data is needed on the pier, an actual physical structure gets moved the specified distance, and the actual resulting force is returned to the program. "So on this component, instead of having finite elements, we have physical elements." Mahin said that this day's physical simulation was itself a preliminary test for later hybrid testing. "The shake table test represents what the structure would do in an earthquake. We are then going to move the specimen to another building and hook it up to large, computer-controlled hydraulic actuators to see if we can replicate what we saw here."
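In code, the hybrid scheme amounts to swapping one element's force computation for a call out to the laboratory hardware. A skeletal single-degree-of-freedom sketch (every name here is hypothetical, and the actuator is stubbed with a spring so the example runs stand-alone):

```cpp
#include <cstdio>

// Stand-in for the hardware in the loop: command the actuator to a
// displacement, read back the force the specimen resists with from
// the load cell. Stubbed here as a linear spring.
double imposeDisplacement(double u) {
    const double kSpecimen = 50.0;  // pretend physical pier stiffness
    return kSpecimen * u;
}

// One explicit (central-difference) step of a 1-DOF hybrid model:
// the numerical substructure contributes stiffness kNum, the physical
// substructure contributes whatever force the lab reports back.
void hybridStep(double& uPrev, double& u, double m, double kNum,
                double pExt, double dt) {
    double fPhysical  = imposeDisplacement(u);  // physical "element"
    double fNumerical = kNum * u;               // modeled substructure
    double acc = (pExt - fPhysical - fNumerical) / m;
    double uNext = 2.0 * u - uPrev + dt * dt * acc;
    uPrev = u;
    u = uNext;
}

int main() {
    double uPrev = 0.0, u = 0.0;
    for (int i = 0; i < 5; ++i) {
        hybridStep(uPrev, u, 1.0, 100.0, 10.0, 0.01);
        std::printf("step %d: u = %g\n", i, u);
    }
    return 0;
}
```

Replace the spring stub with commands to a real actuator, possibly across a network, and the same loop drives a geographically distributed test.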
Inside the control room, a technician had his eyes on a computer screen and his hand, over his shoulder, poised on a dial. A flick of that dial would trigger a set of 75,000-pound hydraulic actuators, eight horizontal and four vertical, built into a pit below the table, creating an "earthquake" on demand. As if opening a safe, he pulsed the dial, and it took me a split second before I realized the obvious: that the shaking would not be confined to the table. For an instant, my native California brain, with its first-hand knowledge of earthquakes, insisted that this was the real thing.
Sidebar: Japan and the U.S.: Neighbors on the Ring of Fire
Some of the largest quakes in United States history took place not in California, but 1,500 miles to the east, in Missouri. The year was 1811. Professor Gregory Fenves points out that some 30 U.S. states face significant earthquake risk.
But the Pacific Ring of Fire is where most earthquakes take place, so it is not surprising that, for PEER researchers, Japan is right next door. "Our biggest research collaborators are in Japan," Fenves says. "Japan has the largest shake table now, at the E-Defense facility in Miki City just outside Kobe, inaugurated a year and a half ago on the anniversary of the Kobe earthquake. The research director, Dr. Masayoshi Nakashima of Kyoto University, is a close friend. I go to Japan once or twice a year, and Japanese researchers often come to Berkeley. The collaboration between our two countries has been ongoing for 25 years." (To see the shake table in action, check out the E-Defense website.)
The earthquake codes are also similar. "Japan legislates building codes at the federal level, while in the U.S., codes are adopted at the state and local level."
Hybrid testing techniques have also brought the two countries together. The "hardware in the loop" of a software simulation can be located anywhere within reach of a data network, even across the Pacific. This November, for example, portions of a bridge will be physically tested at PEER and in Kyoto, with software analysis done at the Tokyo Institute of Technology. That project will lead to a larger one: a full-size shake-table test at the Miki City facility.