Jack B. Muir

False positives are common in single-station template matching

Many signals of interest to seismologists are so small that they cannot be easily seen on seismograms. In order to identify these signals, seismologists have developed the technique of template matching, which takes a well-recorded signal (the template) and slides it along a continuous seismogram. If the template matches part of the seismogram under a certain mathematical definition of similarity, then we consider it to be a detection, and we add that part of the seismogram to the catalogue of signals. Normally, seismologists cross-check this process using multiple seismograms recorded at different instruments, but this is not necessarily possible on other planets, where it is too expensive to deploy many seismometers. Without this cross-checking, it is possible that many of the “matches” are in fact false positives. We performed a statistical experiment to show that these false positives are likely to be quite common, which means that we must be careful when handling template matching with single seismometers.
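
As a concrete illustration, the sketch below implements the core of single-station template matching: a normalized cross-correlation between a template and a continuous trace, with a detection threshold. The synthetic data, the 0.5 threshold, and the function names are illustrative choices, not the exact pipeline used in the paper.

```python
import numpy as np

def normalized_cross_correlation(trace, template):
    """Slide the template along the trace and return the normalized
    cross-correlation coefficient at every lag."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    ncc = np.empty(len(trace) - n + 1)
    for i in range(len(ncc)):
        window = trace[i:i + n]
        w = (window - window.mean()) / (window.std() + 1e-12)
        ncc[i] = np.dot(w, t) / n
    return ncc

rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))   # 5 Hz wavelet
trace = rng.normal(0.0, 1.0, 5000)                          # noisy continuous record
trace[2000:2100] += 2.0 * template                          # buried copy of the template

ncc = normalized_cross_correlation(trace, template)
detections = np.flatnonzero(ncc > 0.5)                      # threshold is a tuning choice
print(detections)                                           # indices near sample 2000
```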

A Deep Gaussian Process Model for Seismicity Background Rates

As far as we know, accurately predicting earthquakes is impossible. However, in a statistical sense, we can produce good forecasts of earthquake activity. Forecasts give us a prediction of the chance of an earthquake happening at some time. One of the most successful short-term forecasting algorithms is called ETAS (epidemic-type aftershock sequence). ETAS combines a self-excitation mechanism (earthquakes triggering more earthquakes) with a steady production of earthquakes from some background process. By fitting the forecasting model to observed data, we can both predict probabilities forward in time and also understand the nature of the earthquake sequence up to the observation time. The background process gives us great insight into what is causing earthquake activity in the first place, but it is very difficult to separate the observed earthquakes into triggered and background events. In this study, we investigated the use of deep Gaussian processes as a means to perform this separation, and found that they efficiently and robustly allow us to extract the background rate.
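
For readers who like to see the model written out, the sketch below evaluates the standard ETAS conditional intensity: a constant background rate plus Omori-law triggering from past events. All parameter values are illustrative, and the point of the paper is to replace the constant background term with a rate modelled by a deep Gaussian process.

```python
import numpy as np

def etas_intensity(t, event_times, event_mags,
                   mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, M0=3.0):
    """Expected earthquakes per unit time at time t given the past catalogue:
    lambda(t) = mu + sum over past events of
    K * exp(alpha * (M_i - M0)) * (t - t_i + c)**(-p)."""
    past = event_times < t
    dt = t - event_times[past]
    triggering = K * np.exp(alpha * (event_mags[past] - M0)) * (dt + c) ** (-p)
    return mu + triggering.sum()

# Toy catalogue: event times (days) and magnitudes.
times = np.array([1.0, 1.2, 5.0])
mags = np.array([5.5, 4.0, 4.5])
print(etas_intensity(6.0, times, mags))
```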

Long-wavelength topography and multiscale velocity heterogeneity at the core-mantle boundary

The most important internal boundary of the Earth is the core-mantle boundary (CMB), between the liquid iron outer core and the rocky mantle above it. The shape of this boundary has important implications for how the Earth has evolved through time, in particular because it gives an indication of how heat flows out of the core and into the mantle. However, determining this shape has remained stubbornly difficult for the past 40 years. In this paper, we use a dataset that is particularly targeted towards this region to image the CMB shape and the lowermost mantle using a robust statistical method developed in our previous work. We find that the CMB has large hills and valleys, but is relatively smooth compared to the complex structure of the lowermost mantle above it.

Wavefield-based evaluation of DAS instrument response and array designs

Distributed acoustic sensing (DAS) networks turn fibre optic cables into highly dense arrays of seismometers. However, rather than measuring the ground motion directly, like a normal seismometer, DAS measures ground strain. Because of this, it is quite difficult to build up a full picture of ground motion in an area using only DAS. We use our new signal processing framework to convert DAS records of strain into records of ground displacement, and show how this allows us to combine them with normal seismometers. This has useful applications for calibrating DAS, designing DAS arrays for optimal sensitivity, and compressing the data.
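
The simplest way to see the link between the two kinds of measurement is the classical plane-wave relation, in which axial strain on a straight fibre is proportional to along-fibre particle velocity. The sketch below uses only that simplification, with synthetic values, for illustration; the wavefield-based framework in the paper does not rely on a single-plane-wave assumption.

```python
import numpy as np

def strain_to_velocity(strain, apparent_velocity):
    """Convert axial strain on a straight fibre to along-fibre particle
    velocity, assuming a single plane wave with known apparent velocity:
    v(t) = -c * strain(t)."""
    return -apparent_velocity * strain

# Synthetic example: a 10 Hz plane wave with 2 km/s apparent velocity,
# recorded as strain along the fibre.
dt = 0.001
t = np.arange(0.0, 2.0, dt)
c = 2000.0                                      # apparent velocity (m/s)
velocity_true = 1e-4 * np.sin(2 * np.pi * 10 * t)
strain = -velocity_true / c                     # plane-wave relation

print(np.allclose(strain_to_velocity(strain, c), velocity_true))
```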

Parsimonious velocity inversion applied to the Los Angeles Basin, CA

Los Angeles is one of the US cities with the highest risk of damage due to earthquakes, due to the large number of nearby active faults and its location on a deep bowl of weak rock, which tends to amplify earthquake damage. We use a large number of instruments located in Los Angeles district schools to make measurements of earthquakes that occurred near Ridgecrest, California in July 2019. These earthquakes generated strong Love waves, a type of seismic energy that is particularly useful for studying the basin structures responsible for amplifying shaking. Using this data, we applied a new imaging technique based on perturbations of identified geological surfaces to create a local model of the northeast Los Angeles basin at higher resolution than had previously been available. Our imaging technique appropriately balances information from previous, lower resolution inversions with the new data obtained in this study, and identifies which parts of the model are most robust.

Sub-kilometer correlation between near-surface structure and ground motion measured with distributed acoustic sensing

When earthquakes shake the ground, one of the most important factors controlling the amount of damage is the earth structure immediately underneath a building or piece of infrastructure. Weak, seismically slow soils amplify shaking and cause greater damage. Mapping out this structure at high resolution is challenging, however, as seismic instruments are expensive and hence normally only sparsely distributed. Using distributed acoustic sensing (DAS), we can get much finer scale resolution. In this study, I contributed tomography using our level-set approach, which helped to show that DAS is strongly sensitive to near-site structure and can be used to create earth models which successfully predict observed ground motion amplifications. This has the potential to revolutionize geophysical surveying for seismic hazard.

HypoSVI - Hypocentral Earthquake Location Analysis using Machine Learning based Stein Variational Gradient Descent

One of the most important things to know about an earthquake is where it happened: under which location, how deep, and at what time. However, given that seismic data is noisy, the structure of the Earth is generally unknown, and we don't have very many stations, figuring out where earthquakes occurred is surprisingly difficult. Indeed, the best answer is a probability distribution over the earthquake's location. Probability calculations can be quite expensive, but in this paper we use the interesting ensemble algorithm of Stein Variational Gradient Descent, which fits an ensemble variational approximation to our uncertainty. This provides an efficient way of mapping earthquake locations.
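
To give a flavour of the algorithm, the sketch below runs Stein Variational Gradient Descent on a toy two-dimensional Gaussian target standing in for the probability distribution over an earthquake's location. The kernel, step size, particle count, and target are all illustrative choices rather than the configuration used in HypoSVI.

```python
import numpy as np

def rbf_kernel(x):
    """RBF kernel matrix with median-heuristic bandwidth, plus its gradient
    with respect to the first argument."""
    diff = x[:, None, :] - x[None, :, :]            # shape (n, n, d)
    sq = (diff ** 2).sum(-1)
    h2 = np.median(sq) / np.log(len(x) + 1.0)       # median heuristic
    k = np.exp(-sq / h2)
    grad_k = -2.0 * diff * k[..., None] / h2        # d k(x_i, x_j) / d x_i
    return k, grad_k

def grad_log_p(x, mean, cov_inv):
    """Score function of a Gaussian target (stand-in for a location posterior)."""
    return -(x - mean) @ cov_inv

rng = np.random.default_rng(1)
mean = np.array([2.0, -1.0])
cov_inv = np.linalg.inv(np.array([[1.0, 0.3], [0.3, 0.5]]))

particles = rng.normal(size=(50, 2))                # initial ensemble of candidate locations
for _ in range(500):
    k, grad_k = rbf_kernel(particles)
    score = grad_log_p(particles, mean, cov_inv)
    phi = (k @ score + grad_k.sum(axis=0)) / len(particles)
    particles += 0.1 * phi                          # SVGD update step

print(particles.mean(axis=0))                       # should approach [2, -1]
```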

Seismic wavefield reconstruction using a preconditioned wavelet-curvelet compressive sensing approach

Seismic data is normally recorded on a per-channel basis: each instrument writes out a file with the motion of the earth recorded at regularly sampled times. However, we know that the motion satisfies the seismic wave equation, so there should be a way to optimally interpolate all of these recordings in space. We use compressive sensing, a technique developed in signal processing and imaging, to create an optimal interpolation scheme for seismic data that respects wave physics and so allows us to perform innovative new analyses of the seismic wavefield.
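
The core idea is that the wavefield is sparse in a suitable transform domain, so it can be recovered from incomplete measurements by a sparsity-promoting inversion. The toy sketch below recovers a one-dimensional signal that is sparse in a cosine basis from a random 30% subset of its samples using iterative soft thresholding; the paper's preconditioned wavelet-curvelet scheme works on real multi-dimensional wavefields and is considerably more sophisticated.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(2)
n = 512
coeffs_true = np.zeros(n)
coeffs_true[[10, 37, 90]] = [1.0, -0.7, 0.5]        # sparse spectrum
signal = idct(coeffs_true, norm="ortho")            # fully sampled "wavefield"

mask = rng.random(n) < 0.3                          # keep roughly 30% of samples
observed = signal[mask]

coeffs, step, lam = np.zeros(n), 1.0, 1e-3
for _ in range(300):
    residual = idct(coeffs, norm="ortho")[mask] - observed
    zero_filled = np.zeros(n)
    zero_filled[mask] = residual
    grad = dct(zero_filled, norm="ortho")           # adjoint of the masked synthesis
    coeffs -= step * grad                           # gradient step on the data misfit
    coeffs = np.sign(coeffs) * np.maximum(np.abs(coeffs) - step * lam, 0.0)  # soft threshold

print(np.flatnonzero(np.abs(coeffs) > 0.1))         # should recover indices 10, 37, 90
```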

Probabilistic lowermost mantle P-Wave tomography from hierarchical Hamiltonian Monte Carlo and model parametrization cross-validation

The structure of the Earth is dominated by three primary interfaces: the surface, the Core-Mantle Boundary (CMB), and the Inner Core Boundary. The Core-Mantle Boundary is potentially the most interesting of these, because it strongly controls the heat flux from the core, and hence has wide-ranging implications for flow in the mantle and the generation of Earth's magnetic field. The area of the mantle just above the CMB is known as the lowermost mantle and is particularly complex. In this study, we used recent developments in imaging science to provide constraints on just how complex the lowermost mantle is, using a unique dataset that was particularly sensitive to short lengthscale structures. We found that our data required increased short lengthscale complexity compared to existing datasets, suggesting a "patchier" lowermost mantle than is often assumed.

Geometric and Level Set Tomography using Ensemble Kalman Inversion

Tomography refers to the process by which we infer the hidden internal structure of an object by sending energy through it, rather than by destructive testing. Most people would be familiar with the medical CT scan (computed tomography) which uses the attenuation of x-rays through the body to create detailed, non-invasive images. We can use a similar idea to create images of the interior of the earth using seismic waves; however because the imaging geometry and propagation physics are much less favorable than in the medical setting, seismic images are much harder to interpret. We propose a method of simplifying the way that seismic images are described during tomography, to make them more robust and interpretable. We use a mixture of explicit geological features such as faults, and boundaries between geological units described implicitly by a mathematical object known as a level set. This allows us to image sharp features much more cleanly than using traditional seismic techniques.
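
To make the level-set idea concrete, the sketch below shows the simplest version of such a parametrization: the interface between two rock units is the zero contour of a smooth function, and the inversion (Ensemble Kalman Inversion in the paper, not shown here) updates that function rather than the velocity in every grid cell. The geometry and velocity values are purely illustrative.

```python
import numpy as np

nx, nz = 200, 100
x = np.linspace(0.0, 20.0, nx)                     # distance (km)
z = np.linspace(0.0, 10.0, nz)                     # depth (km)
X, Z = np.meshgrid(x, z)

# Level-set function: negative above an undulating interface, positive below it.
interface_depth = 4.0 + 1.5 * np.sin(2.0 * np.pi * X / 20.0)
phi = Z - interface_depth

# Two geological units; the boundary between them is simply the zero contour
# of phi, so perturbing phi moves the interface without remeshing anything.
v_basin, v_basement = 2.0, 4.5                     # km/s, illustrative values
velocity = np.where(phi < 0.0, v_basin, v_basement)

print(velocity.shape, float(velocity.min()), float(velocity.max()))
```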

Did Oldham Discover the Core After All? Handling Imprecise Historical Data with Hierarchical Bayesian Model Selection Methods

One of the most important developments of early 20th century Earth science was the determination that the Earth was made up of multiple layers with wildly different physical characteristics. Probably the most important study during this period was R.D. Oldham's 1906 paper, which synthesised observations of multiple earthquakes at seismometers across the world to build a comprehensive picture of how long it took seismic waves to travel through the deep Earth. On the basis of analysing this data, Oldham correctly proposed that the Earth had a core, distinct from the material above it. However, it turned out that while Oldham's conclusions were correct, his analysis was totally wrong, and in fact used data that had nothing to do with the core. Nevertheless, part of his data may have been sufficient to "discover" the core correctly, had Oldham analysed it appropriately. We use this significant (and fun) historical dataset to develop new statistical techniques for treating historical seismic data, which is often riddled with imprecision and errors. In doing so, we found that Oldham did in fact have sufficient data to discover the core after discarding his incorrect analyses, and provided a useful tool to the geophysical community for treating historical data.

Rayleigh-Wave H/V via Noise Cross Correlation in Southern California

Ambient Noise Cross-Correlation is a technique that allows us to use the parts of seismograms that we would normally throw away to simulate an earthquake located at any seismometer. As such, it has become a very popular technique, especially in the study of Rayleigh waves, which are seismic waves that are trapped along the surface. These waves have a characteristic rolling motion — any point in a Rayleigh wave moves in an oval. The ratio of the long vs short axis of this oval (the H/V ratio) turns out to be very sensitive to the structure of the Earth near the surface. We calculated H/V ratios for all of the stations in Southern California, which gave us a new dataset constraining the near surface of this important, earthquake-prone region.
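
As a worked example of what the measurement means, the sketch below builds synthetic Rayleigh-wave particle motion with a 90-degree phase shift between the radial and vertical components and reads the H/V ratio off the amplitude spectra. Real measurements, as in the paper, are made on noise cross-correlations rather than on a clean monochromatic signal, and the frequency and ellipticity used here are arbitrary.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 60.0, dt)
freq = 0.2                                      # Hz, illustrative
hv_true = 0.7                                   # true ellipticity of the particle motion

# Rayleigh-wave particle motion: radial and vertical components 90 degrees
# out of phase, tracing an ellipse with axis ratio hv_true.
vertical = 1.0 * np.sin(2 * np.pi * freq * t)
radial = hv_true * np.cos(2 * np.pi * freq * t)

# Spectral-amplitude estimate of H/V at the target frequency.
spec_v = np.abs(np.fft.rfft(vertical))
spec_r = np.abs(np.fft.rfft(radial))
freqs = np.fft.rfftfreq(len(t), dt)
idx = np.argmin(np.abs(freqs - freq))

print(spec_r[idx] / spec_v[idx])                # should be close to 0.7
```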

Strong, Multi-Scale Heterogeneity in Earth’s Lowermost Mantle

The lowermost mantle refers to the region of the Earth, roughly half-way towards its center, where normal rock gives way to the liquid iron-nickel alloy of the outer core. Because of this extreme transition, it is one of the most interesting parts of the Earth. Previous robust studies of the area have mostly used seismic data which averages over large patches of the lowermost mantle, leading to an image of the lowermost mantle dominated by two large features known as low velocity provinces. We use data with sharper sensitivity to show that the picture in the lowermost mantle is in fact much more complicated, and that the low velocity provinces are less distinct than the previous studies had suggested.

A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference

Spherical harmonic analysis turns data defined by coordinates on the surface of a sphere (like the Earth, approximately) into a representation based on resonant frequencies, which is useful for many types of further studies. There are fast ways to do this if the data is regularly distributed on the sphere, but for irregular distributions spherical harmonic analysis is quite hard. We developed a new statistical method for spherical harmonic analysis using random sampling to produce uncertainty estimates of the transform.
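
To show the kind of forward model involved, the sketch below fits real spherical harmonic coefficients to data at irregularly scattered points by damped least squares. The hierarchical Bayesian machinery in the paper wraps a model like this in a sampler to produce uncertainty estimates on the coefficients; the maximum degree, damping, and synthetic data here are illustrative.

```python
import numpy as np
from scipy.special import sph_harm

def real_sph_harm_matrix(lon, colat, lmax):
    """Design matrix of real spherical harmonics at irregular points."""
    columns = []
    for l in range(lmax + 1):
        for m in range(-l, l + 1):
            y = sph_harm(abs(m), l, lon, colat)
            if m < 0:
                columns.append(np.sqrt(2.0) * y.imag)
            elif m == 0:
                columns.append(y.real)
            else:
                columns.append(np.sqrt(2.0) * y.real)
    return np.column_stack(columns)

rng = np.random.default_rng(3)
npts, lmax, damping = 500, 8, 1e-3
lon = rng.uniform(0.0, 2.0 * np.pi, npts)            # azimuth (radians)
colat = np.arccos(rng.uniform(-1.0, 1.0, npts))      # points uniform on the sphere

G = real_sph_harm_matrix(lon, colat, lmax)
coeffs_true = rng.normal(size=G.shape[1])
data = G @ coeffs_true + 0.01 * rng.normal(size=npts)

# Damped least-squares estimate of the spherical harmonic coefficients.
coeffs_hat = np.linalg.solve(G.T @ G + damping * np.eye(G.shape[1]), G.T @ data)
print(np.max(np.abs(coeffs_hat - coeffs_true)))      # recovery error should be small
```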

A single-probe-beam double-heterodyne polarimeter-interferometer for plasma Faraday rotation measurements

Commercial nuclear fusion has long been a holy grail of clean energy production, promising to totally replace baseload fossil fuel generators and to provide a useful counterbalance to fickle wind and solar. However, despite much promise, fusion has proven to be an engineering nightmare, and so has yet to realize its potential. One of the key considerations for fusion production is accurate measurement of the internal state of the plasma inside the generator. The plasma is the burning fuel of fusion, analogous to ignited gasoline in an internal combustion engine. As an undergraduate research project, we developed a novel microwave-based sensor for determining the physical properties of a test plasma. Of note was that we were an early adopter of 3D printing for custom scientific equipment, in this case a particular microwave-optical element of the sensor.