
Chapter 2: Literature Review

2.2 Earthquake Data Considerations

Engineers usually use simplified methods for the design and assessment of structures, such as the response spectrum method or code-based guidelines. However, some complex situations require fully dynamic analyses for verification: for example, buildings designed for high ductility levels, structures with highly irregular configurations, or critical structures whose disruption could have grave impacts. In these situations, the earthquake loading must be represented by acceleration time histories, so the criteria for selecting the suite of strong-motion records used in these analyses can be critical. Typically, the acceleration records considered to be of engineering interest, meaning those capable of causing inelastic structural response, come from moderate events (5 ≤ Mw < 7), major events (7 ≤ Mw < 8), or great earthquakes (Mw ≥ 8).
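The magnitude bins above can be written out as a small helper function. This is purely illustrative: the function name and the label for sub-threshold events are not taken from any cited source, only the thresholds themselves.

```python
def classify_event(mw: float) -> str:
    """Classify a moment magnitude Mw into the bins of engineering
    interest quoted in the text (thresholds from the text; labels
    for out-of-range events are this sketch's own)."""
    if mw >= 8.0:
        return "great"
    if mw >= 7.0:
        return "major"
    if mw >= 5.0:
        return "moderate"
    return "below engineering interest"
```

A record from an Mw 6.2 event would thus be binned as "moderate", and one from an Mw 7.5 event as "major".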

Figure 2.7: Magnitude-distance distribution of strong-motion records in the NGA West2 database (Ancheta et al., 2013).

Many researchers consider magnitude an important, or at least initial, criterion for the selection of earthquake ground motions, since it has been established as having a significant influence on response parameters such as the amplitude and duration of strong motion (Katsanos et al., 2010). However, there is a comparative paucity of high-magnitude acceleration records relative to those from lower-magnitude events, compounded by a further reduced availability of records close to the source (see Figure 2.7).

There are three types of acceleration time histories an engineer could use: spectrum-compatible accelerograms based upon real historical records, fully synthetic accelerograms, and real accelerograms left unmodified (apart from necessary processing adjustments). The first category is appealing because it makes it possible to obtain acceleration time series whose response spectrum matches a particular elastic design spectrum over the entire period range with small error. However, concerns have been raised about the use of such records in nonlinear analyses, including an “excessive number of cycles of strong motion,” meaning a longer duration of strong motion, and an “unreasonably high energy content” (Bommer & Acevedo, 2004).
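Assessing how well a record's response spectrum matches a design spectrum first requires computing that spectrum. A minimal sketch is given below, using the standard Newmark average-acceleration method on a unit-mass SDOF oscillator; the function name, units, and numerical settings are this sketch's assumptions, not those of any cited study.

```python
import numpy as np

def response_spectrum(ag, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground-acceleration
    record ag (sampled at time step dt) for a damped unit-mass SDOF,
    integrated with the Newmark average-acceleration method
    (gamma = 1/2, beta = 1/4, unconditionally stable)."""
    sa = np.empty(len(periods))
    p = -np.asarray(ag, dtype=float)          # effective force for m = 1
    for j, T in enumerate(periods):
        w = 2.0 * np.pi / T                   # natural circular frequency
        k, c = w * w, 2.0 * zeta * w          # unit-mass stiffness, damping
        khat = k + 2.0 * c / dt + 4.0 / dt**2
        u, v = 0.0, 0.0
        a = p[0]                              # at-rest initial conditions
        umax = 0.0
        for i in range(len(p) - 1):
            # incremental equilibrium over one step
            dphat = (p[i + 1] - p[i]) + (4.0 / dt + 2.0 * c) * v + 2.0 * a
            du = dphat / khat
            dv = 2.0 * du / dt - 2.0 * v
            da = 4.0 * (du - v * dt) / dt**2 - 2.0 * a
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        sa[j] = w * w * umax                  # pseudo-acceleration Sa = w^2 * Sd
    return sa
```

Comparing `response_spectrum(record, dt, periods)` against the target design spectrum ordinates over the period range of interest is the basic check behind spectrum-compatible selection or matching.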

On the other hand, others have argued that using a larger suite of real accelerograms to match the design spectrum would give the same results as using artificial accelerograms (Priestley et al., 2007). The second category comprises records generated from seismological source models that account for path and site effects. Using this type of accelerogram requires the engineer to identify the many parameters needed to characterize the earthquake source, which typically carry high uncertainties, and its use requires expert knowledge (Bommer & Acevedo, 2004). The third category is real accelerograms recorded during earthquake events. Their use in seismic analyses is widely accepted, since the databases of recorded motions continue to grow and the records are mostly easily accessible and available for use.

However, the way in which they are selected, processed, and manipulated is important if the characteristics of the original record are to be preserved and if the motions are to be representative of site conditions, which may matter depending on the goal of the analyses.

As mentioned previously, there are many different databases of earthquake ground motions. These databases usually present the acceleration data organized by categories including magnitude, focal depth, source-to-site distance, and site conditions. One of the first global earthquake databases was that presented by Leeds in 1992, which included approximately 400 horizontal-component records originating mainly from the U.S., Japan, and Mexico (Bommer & Acevedo, 2004). Many other databases have since been created, some focusing on geographical areas and others later including records for specific tectonic regimes (e.g., shallow crustal or subduction zones). Issues have been identified in several of these databases, including a lack of uniformity in the information provided, such as the use of different magnitude scales (e.g., MS, ML, or Mw), and missing information for a portion of the records, which makes them unreliable and can increase uncertainties if used for seismic hazard analysis (Douglas, 2003; Bommer & Acevedo, 2004).

In this sense, the PEER database, specifically the NGA West2 database, presents a stark contrast to the previous lack of consensus among ground-motion databases. NGA West2 includes data collected for a particular tectonic regime, shallow crustal earthquakes, covering both global and California events, as well as a catalog summarizing intensity measures and related metadata (e.g., PGA, site parameters). The acceleration time series for all records provided in the database are uniformly processed using an acausal Butterworth filter (Ancheta et al., 2014).

Beyer and Bommer (2007) identify desirable properties that records should have to be suitable for use in bidirectional analysis of structures. First, the two components need to be processed and filtered on an individual basis following the same parameters, and they give preference to acausal filters that do not alter the phase spectra. Second, the leading zeroes should be of equal length in both horizontal components. On this basis, the records from the NGA West2 database meet the requirements for use in bidirectional analysis.
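The acausal (zero-phase) filtering described above can be sketched with SciPy's forward-backward filtering, which leaves the phase spectra unaltered. The corner frequencies, filter order, and pad length below are illustrative assumptions only; in practice (e.g., in NGA West2 processing) corner frequencies are chosen record by record.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def acausal_bandpass(acc, fs, f_lo=0.1, f_hi=25.0, order=4, pad=500):
    """Zero-phase (acausal) Butterworth band-pass of one acceleration
    component sampled at fs (Hz). filtfilt runs the filter forward and
    then backward, so the magnitude response is applied twice and the
    phase spectrum is unchanged. Corner frequencies are illustrative."""
    # equal-length leading and trailing zero pads, so both horizontal
    # components padded this way keep leading zeroes of the same length
    padded = np.concatenate([np.zeros(pad), np.asarray(acc, float), np.zeros(pad)])
    b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, padded)
```

For bidirectional analysis, the point of the Beyer and Bommer (2007) recommendation is that both horizontal components are passed through this function with identical parameters, e.g. `acausal_bandpass(h1, fs=100.0)` and `acausal_bandpass(h2, fs=100.0)`.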

Furthermore, it is worth noting that most seismic codes do not distinguish between record selection for unidirectional and for bidirectional dynamic analysis, and they are considered to present a very simplified approach to the assessment of seismic loads (Katsanos et al., 2010). Some authors agree, finding that the guidelines provided are often inconsistent or vague with respect to the assumptions made for the selection and processing of motions for bidirectional analysis (Beyer & Bommer, 2007).

In sum, when performing nonlinear time history analyses to verify simplified design methods, such as direct displacement-based design (DDBD), the type of data used (whether real or artificial records) needs to be consistent with the purpose of the analyses.
