Lei Li and 3 more

Seismology focuses on the study of earthquakes and associated phenomena to characterize seismic sources and Earth structure, both of which are of immediate relevance to society. This article is composed of two independent views on the state of the ICON principles (Goldman et al., 2021) in seismology, each reflecting on the opportunities and challenges of adopting them from a different angle. Section 1 deals with the integration of multiscale and multidisciplinary observations, focusing on the integrated and open principles, whereas Section 2 discusses computing and open-source algorithms, reflecting the coordinated, networked, and open principles. Over the past century, seismology has benefited from two co-existing technological advancements: the emergence of new, more capable sensing systems and the spread of affordable, distributed computing infrastructure. Integrating multiple observations is a crucial strategy for improving the understanding of earthquake hazards. However, current efforts to make big datasets available and manageable lack coherence, which makes it challenging to implement initiatives that span different communities. Building on ongoing advances in computing, machine learning algorithms have been revolutionizing seismic data processing and interpretation. A community-driven approach to code management offers open and networked opportunities for young scholars to learn and to contribute to a more sustainable practice of seismology. Investing in new sensors, more capable computing infrastructure, and open-source algorithms following the ICON principles will enable new discoveries across the Earth sciences.

Lei Li and 2 more

Seismic source locations provide fundamental information on earthquakes and lay the foundation for seismic monitoring at all scales. Subsurface engineering operations, such as hydraulic fracturing, usually generate abundant microseismic events with low signal-to-noise ratios, placing higher demands on the reliability and computational efficiency of seismic location methods. Stacking-based methods reconstruct and focus the radiated seismic source energy with a certain stacking operator, for example, the diffraction stacking operator or the cross-correlation stacking operator. They are noise-resistant, automatic, and data-driven, and they resolve source locations as images rather than discrete points, offering more insight into source processes and surrounding structures. In this work, we conduct a performance evaluation and influential-factor analysis with synthetic examples to further improve the understanding of the method. Three categories of factors are investigated: the velocity model, the array geometry, and the waveform complexity. Each category comprises several detailed factors, such as different array types and noise levels. Multiple criteria are considered in the performance evaluation, including location error/bias, imaging resolution, signal alignment with time shifts, and computational cost. The proposed scheme is also applied to field microseismic datasets to demonstrate its feasibility. This study will aid the design and evaluation of surface monitoring projects.
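To make the diffraction-stacking idea concrete, the sketch below is a minimal, self-contained illustration, not the authors' implementation: it assumes a homogeneous velocity model, a hypothetical line of surface receivers, synthetic spike-plus-noise traces, and a 2-D grid search that stacks absolute amplitudes along each candidate source's traveltime moveout. The velocity, geometry, and noise level are all invented for the example.

```python
import numpy as np

# Hypothetical setup: constant P velocity, surface receiver line at z = 0.
v = 2000.0                            # velocity, m/s (assumed)
dt = 0.001                            # sample interval, s
nt = 1000                             # samples per trace
rx = np.linspace(0.0, 1000.0, 21)     # receiver x positions, m

true_src = np.array([480.0, 520.0])   # (x, z) of the synthetic source, m

def traveltime(src, rec_x):
    """Straight-ray traveltime from a source (x, z) to surface receivers."""
    return np.hypot(src[0] - rec_x, src[1]) / v

# Build noisy synthetic traces: a unit spike at each predicted arrival time.
rng = np.random.default_rng(0)
traces = 0.2 * rng.standard_normal((rx.size, nt))
for i, x in enumerate(rx):
    traces[i, int(traveltime(true_src, x) / dt)] += 1.0

# Diffraction stacking: for every candidate grid point, sum the absolute
# amplitudes along its predicted moveout; the stack forms a source image.
xs = np.arange(0.0, 1001.0, 20.0)
zs = np.arange(100.0, 1001.0, 20.0)
image = np.zeros((xs.size, zs.size))
for ix, x0 in enumerate(xs):
    for iz, z0 in enumerate(zs):
        idx = (traveltime((x0, z0), rx) / dt).astype(int)
        image[ix, iz] = np.abs(traces[np.arange(rx.size), idx]).sum()

# The image maximum is taken as the location estimate.
best = np.unravel_index(image.argmax(), image.shape)
print("estimated source:", xs[best[0]], zs[best[1]])
```

Because the energy of all receivers aligns only at the true source position, the stack peaks there even though individual traces are noisy, which is the noise resistance the abstract refers to; a cross-correlation stacking operator would replace the absolute-amplitude sum with pairwise correlations of the shifted traces.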