
Make MTH5 from IRIS Data Management Center v0.1.0

This example demonstrates how to build an MTH5 from data archived at IRIS; it will likely work with any MT data stored at an FDSN data center.

We will use the mth5.clients.FDSN class to build the file. There is also a second way, using the more generic mth5.clients.MakeMTH5 class, which is highlighted below.

Note: this example assumes that data availability (Network, Station, Channel, Start, End) is already known. If you do not know what data you want to download, use IRIS tools to check data availability.

from pathlib import Path

import numpy as np
import pandas as pd
from mth5.mth5 import MTH5
from mth5.clients import FDSN, MakeMTH5

from matplotlib import pyplot as plt
%matplotlib widget

Set the path to save files to; here we use the current working directory.

default_path = Path().cwd()

Initialize an FDSN object

Here we set the MTH5 file version to 0.1.0, which allows only one survey in a single file, and set the client to “IRIS”. Under the hood, obspy.clients tools are used for the request. Here are the available FDSN clients.

Note: Only the “IRIS” client has been tested.

fdsn_object = FDSN(mth5_version='0.1.0')
fdsn_object.client = "IRIS"

Make the data inquiry as a DataFrame

There are a few ways to make the inquiry to request data.

  1. Make a DataFrame by hand. Here we will make a list of entries and then create a DataFrame with the proper column names.

  2. Create a CSV file with a row for each entry. There is some formatting to be aware of: the column names must match those below, and date-times must be in YYYY-MM-DDThh:mm:ss format.

Column Name   Description
-----------   -----------
network       FDSN Network code (2 letters)
station       FDSN Station code (usually 5 characters)
location      FDSN Location code (typically not used for MT)
channel       FDSN Channel code (3 characters)
start         Start time (YYYY-MM-DDThh:mm:ss) UTC
end           End time (YYYY-MM-DDThh:mm:ss) UTC
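
To check that request date-times conform to the YYYY-MM-DDThh:mm:ss pattern before submitting, one option is to parse them strictly. This helper is a minimal sketch, not part of mth5:

```python
from datetime import datetime


def is_valid_request_time(value: str) -> bool:
    """Return True only if value matches YYYY-MM-DDThh:mm:ss exactly."""
    try:
        datetime.strptime(value, "%Y-%m-%dT%H:%M:%S")
        return True
    except ValueError:
        return False
```

For example, `is_valid_request_time("2020-06-02T19:00:00")` is True, while a space-separated or date-only timestamp is rejected.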
channels = ["LFE", "LFN", "LFZ", "LQE", "LQN"]
CAS04 = ["8P", "CAS04",  '2020-06-02T19:00:00', '2020-07-13T19:00:00'] 

request_list = []
for entry in [CAS04]:
    for channel in channels:
        request_list.append(
            [entry[0], entry[1], "", channel, entry[2], entry[3]]
        )

# Turn list into dataframe
request_df = pd.DataFrame(request_list, columns=fdsn_object.request_columns)
request_df

Save the request as a CSV

It's helpful to save the request as a CSV so it can be modified and reused later. A CSV file can be passed directly as a request to MakeMTH5.

request_df.to_csv(default_path.joinpath("fdsn_request.csv"))
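
A sketch of round-tripping the request through CSV (the path and column names follow the example above): index_col=0 skips the index column that to_csv writes, and keep_default_na=False keeps the empty location code as an empty string rather than NaN.

```python
import tempfile
from pathlib import Path

import pandas as pd

request_columns = ["network", "station", "location", "channel", "start", "end"]
request_df = pd.DataFrame(
    [["8P", "CAS04", "", "LFE", "2020-06-02T19:00:00", "2020-07-13T19:00:00"]],
    columns=request_columns,
)

csv_fn = Path(tempfile.gettempdir()) / "fdsn_request.csv"
request_df.to_csv(csv_fn)

# index_col=0 drops the written index; keep_default_na=False preserves "" locations.
reread_df = pd.read_csv(csv_fn, index_col=0, keep_default_na=False)
```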

Get only the metadata from IRIS

It can be helpful to make sure that your request is what you would expect. For that you can request only the metadata from IRIS. The request is quick and lightweight, so speed is not a concern. The request returns StationXML, which is loaded into an obspy.Inventory object.

inventory, data = fdsn_object.get_inventory_from_df(request_df, data=False)

Have a look at the Inventory to make sure it contains what is requested.

inventory
Inventory created at 2026-01-10T04:19:50.934478Z Created by: ObsPy 1.4.1 https://www.obspy.org Sending institution: MTH5 Contains: Networks (1): 8P Stations (1): 8P.CAS04 (Corral Hollow, CA, USA) Channels (8): 8P.CAS04..LFZ, 8P.CAS04..LFN, 8P.CAS04..LFE, 8P.CAS04..LQN (2x), 8P.CAS04..LQE (3x)

Make an MTH5 from a request

Now that we've created a request and made sure it's what we expect, we can make an MTH5 file. The input can be either the DataFrame or the CSV file.

We are going to time it to get an indication of how long it might take; it should take about 4 minutes.

Note: we are setting interact=False. If you want to keep the file open to interrogate it afterwards, set interact=True.

Make an MTH5 using MakeMTH5

Another way to make a file is with the mth5.clients.MakeMTH5 class, which is more generic than FDSN but doesn't have as many methods. The MakeMTH5 class is meant to be a convenience interface to the various clients.

from mth5.clients import MakeMTH5

make_mth5_object = MakeMTH5(mth5_version='0.1.0', interact=False)
mth5_filename = make_mth5_object.from_fdsn_client(request_df, client="IRIS")
%%time

mth5_filename = MakeMTH5.from_fdsn_client(request_df, interact=False, **{"mth5_version": "0.1.0"})

print(f"Created {mth5_filename}")
2026-01-09T21:13:20.126262-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2026-01-09T21:13:20.131262-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2026-01-09T21:13:20.154056-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2026-01-09T21:13:20.159402-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2026-01-09T21:13:20.182189-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2026-01-09T21:13:20.189784-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2026-01-09T21:13:20.209130-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2026-01-09T21:13:20.209130-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2026-01-09T21:13:20.245426-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2026-01-09T21:13:20.249975-0800 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | line: 140 | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2026-01-09T21:13:21.694026-0800 | INFO | mth5.mth5 | _initialize_file | line: 678 | Initialized MTH5 0.1.0 file c:\Users\peaco\OneDrive\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5 in mode w
2026-01-09T21:13:32.776473-0800 | INFO | mth5.groups.base | _add_group | line: 330 | RunGroup a already exists, returning existing group.
2026-01-09T21:13:44.869301-0800 | WARNING | mth5.timeseries.run_ts | _validate_array_list | line: 518 | Station ID CAS04 from ChannelTS does not match original station ID {self.station_metadata.id}. Updating ID to match.
2026-01-09T21:13:44.907185-0800 | WARNING | mth5.timeseries.run_ts | validate_metadata | line: 1035 | start time of dataset 2020-06-02T19:00:00+00:00 does not match metadata start 2020-06-02T18:41:43+00:00 updating metatdata value to 2020-06-02T19:00:00+00:00
2026-01-09T21:16:10.904821-0800 | WARNING | mth5.timeseries.run_ts | validate_metadata | line: 1035 | start time of dataset 2020-06-02T19:00:00+00:00 does not match metadata start 2020-06-02T18:41:43+00:00 updating metatdata value to 2020-06-02T19:00:00+00:00
2026-01-09T21:16:39.892484-0800 | INFO | mth5.groups.base | _add_group | line: 330 | RunGroup b already exists, returning existing group.
2026-01-09T21:16:43.253589-0800 | WARNING | mth5.timeseries.run_ts | _validate_array_list | line: 518 | Station ID CAS04 from ChannelTS does not match original station ID {self.station_metadata.id}. Updating ID to match.
2026-01-09T21:16:51.672639-0800 | INFO | mth5.groups.base | _add_group | line: 330 | RunGroup c already exists, returning existing group.
2026-01-09T21:16:55.031399-0800 | WARNING | mth5.timeseries.run_ts | _validate_array_list | line: 518 | Station ID CAS04 from ChannelTS does not match original station ID {self.station_metadata.id}. Updating ID to match.
2026-01-09T21:16:59.967894-0800 | INFO | mth5.groups.base | _add_group | line: 330 | RunGroup d already exists, returning existing group.
2026-01-09T21:17:03.083336-0800 | WARNING | mth5.timeseries.run_ts | _validate_array_list | line: 518 | Station ID CAS04 from ChannelTS does not match original station ID {self.station_metadata.id}. Updating ID to match.
2026-01-09T21:17:03.125959-0800 | WARNING | mth5.timeseries.run_ts | validate_metadata | line: 1045 | end time of dataset 2020-07-13T19:00:00+00:00 does not match metadata end 2020-07-13T21:46:12+00:00 updating metatdata value to 2020-07-13T19:00:00+00:00
2026-01-09T21:17:03.274681-0800 | WARNING | mth5.timeseries.run_ts | validate_metadata | line: 1045 | end time of dataset 2020-07-13T19:00:00+00:00 does not match metadata end 2020-07-13T21:46:12+00:00 updating metatdata value to 2020-07-13T19:00:00+00:00
2026-01-09T21:17:09.326718-0800 | INFO | mth5.mth5 | close_mth5 | line: 772 | Flushing and closing c:\Users\peaco\OneDrive\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5
Created c:\Users\peaco\OneDrive\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5
CPU times: total: 41.3 s
Wall time: 4min 37s
# open file already created
mth5_object = MTH5()
mth5_object.open_mth5(mth5_filename)
/:
====================
|- Group: Survey
    |- Group: Filters
        |- Group: coefficient
            |- Group: electric_analog_to_digital
            |- Group: electric_dipole_92.000
            |- Group: electric_si_units
            |- Group: magnetic_analog_to_digital
        |- Group: fap
        |- Group: fir
        |- Group: time_delay
            |- Group: electric_time_offset
            |- Group: hx_time_offset
            |- Group: hy_time_offset
            |- Group: hz_time_offset
        |- Group: zpk
            |- Group: electric_butterworth_high_pass_30000
                --> Dataset: poles
                --> Dataset: zeros
            |- Group: electric_butterworth_low_pass
                --> Dataset: poles
                --> Dataset: zeros
            |- Group: magnetic_butterworth_low_pass
                --> Dataset: poles
                --> Dataset: zeros
    |- Group: Reports
    |- Group: Standards
        --> Dataset: summary
    |- Group: Stations
        |- Group: CAS04
            |- Group: Features
            |- Group: Fourier_Coefficients
            |- Group: Transfer_Functions
            |- Group: a
                --> Dataset: ex
                --> Dataset: ey
                --> Dataset: hx
                --> Dataset: hy
                --> Dataset: hz
            |- Group: b
                --> Dataset: ex
                --> Dataset: ey
                --> Dataset: hx
                --> Dataset: hy
                --> Dataset: hz
            |- Group: c
                --> Dataset: ex
                --> Dataset: ey
                --> Dataset: hx
                --> Dataset: hy
                --> Dataset: hz
            |- Group: d
                --> Dataset: ex
                --> Dataset: ey
                --> Dataset: hx
                --> Dataset: hy
                --> Dataset: hz
    --> Dataset: channel_summary
    --> Dataset: fc_summary
    --> Dataset: tf_summary

Have a look at the contents of the created file

mth5_object
/:
====================
|- Group: Experiment
    |- Group: Reports
    |- Group: Standards
        --> Dataset: summary
    |- Group: Surveys
        |- Group: CONUS_South
            |- Group: Filters
                |- Group: coefficient
                    |- Group: electric_analog_to_digital
                    |- Group: electric_dipole_92.000
                    |- Group: electric_si_units
                    |- Group: magnetic_analog_to_digital
                |- Group: fap
                |- Group: fir
                |- Group: time_delay
                    |- Group: electric_time_offset
                    |- Group: hx_time_offset
                    |- Group: hy_time_offset
                    |- Group: hz_time_offset
                |- Group: zpk
                    |- Group: electric_butterworth_high_pass_30000
                        --> Dataset: poles
                        --> Dataset: zeros
                    |- Group: electric_butterworth_low_pass
                        --> Dataset: poles
                        --> Dataset: zeros
                    |- Group: magnetic_butterworth_low_pass
                        --> Dataset: poles
                        --> Dataset: zeros
            |- Group: Reports
            |- Group: Standards
                --> Dataset: summary
            |- Group: Stations
                |- Group: CAS04
                    |- Group: Features
                    |- Group: Fourier_Coefficients
                    |- Group: Transfer_Functions
                    |- Group: a
                        --> Dataset: ex
                        --> Dataset: ey
                        --> Dataset: hx
                        --> Dataset: hy
                        --> Dataset: hz
                    |- Group: b
                        --> Dataset: ex
                        --> Dataset: ey
                        --> Dataset: hx
                        --> Dataset: hy
                        --> Dataset: hz
                    |- Group: c
                        --> Dataset: ex
                        --> Dataset: ey
                        --> Dataset: hx
                        --> Dataset: hy
                        --> Dataset: hz
                    |- Group: d
                        --> Dataset: ex
                        --> Dataset: ey
                        --> Dataset: hx
                        --> Dataset: hy
                        --> Dataset: hz
    --> Dataset: channel_summary
    --> Dataset: fc_summary
    --> Dataset: tf_summary

Channel Summary

A convenience table is supplied with an MTH5 file. This table provides some information about each channel present in the file. It also provides the columns hdf5_reference, run_hdf5_reference, and station_hdf5_reference; these are internal references within an HDF5 file and can be used to directly access a group or dataset using the mth5_object.from_reference method.

Note: When an MTH5 file is closed, the table is re-summarized, so the next time you open the file the channel_summary will be up to date. The same applies to the tf_summary.

ch_df = mth5_object.channel_summary.to_dataframe()
ch_df
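
Because the summary is a plain DataFrame, standard pandas operations apply. The snippet below is a sketch using a synthetic stand-in frame (the column names and run b times used here are assumptions for illustration, not the full channel_summary schema):

```python
import pandas as pd

# Synthetic stand-in for ch_df with a few plausible channel_summary columns.
ch_df = pd.DataFrame(
    {
        "station": ["CAS04"] * 4,
        "run": ["a", "a", "b", "b"],
        "component": ["ex", "hx", "ex", "hx"],
        "start": pd.to_datetime(
            ["2020-06-02T19:00:00"] * 2 + ["2020-06-02T22:24:55"] * 2
        ),
        "end": pd.to_datetime(
            ["2020-06-02T22:07:46"] * 2 + ["2020-06-12T17:52:23"] * 2
        ),
        "sample_rate": [1.0] * 4,
    }
)

# Select all ex channels and compute each run's duration in seconds.
ex_df = ch_df[ch_df.component == "ex"].copy()
ex_df["duration_s"] = (ex_df.end - ex_df.start).dt.total_seconds()
```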

Have a look at a station

Let's grab one station, CAS04, and have a look at its metadata and contents. Here we will grab it from the mth5_object.

cas04 = mth5_object.get_station("CAS04")
cas04.metadata
{ "station": { "acquired_by.author": "", "channel_layout": "X", "channels_recorded": [ "ex", "ey", "hx", "hy", "hz" ], "data_type": "MT", "fdsn.id": "CAS04", "geographic_name": "Corral Hollow, CA, USA", "hdf5_reference": "<HDF5 object reference>", "id": "CAS04", "location.datum": "WGS 84", "location.declination.comments": "igrf.m by Drew Compston", "location.declination.model": "IGRF-13", "location.declination.value": 13.1692334643401, "location.elevation": 335.2617645265, "location.elevation_uncertainty": 0.0, "location.latitude": 37.633351, "location.latitude_uncertainty": 0.0, "location.longitude": -121.468382, "location.longitude_uncertainty": 0.0, "location.x": 0.0, "location.x2": 0.0, "location.x_uncertainty": 0.0, "location.y": 0.0, "location.y2": 0.0, "location.y_uncertainty": 0.0, "location.z": 0.0, "location.z2": 0.0, "location.z_uncertainty": 0.0, "mth5_type": "Station", "orientation.method": "compass", "orientation.reference_frame": "geographic", "orientation.value": "orthogonal", "provenance.archive.name": "", "provenance.creation_time": "1980-01-01T00:00:00+00:00", "provenance.creator.author": "", "provenance.software.author": "Anna Kelbert, USGS", "provenance.software.name": "mth5_metadata.m", "provenance.software.version": "2024-03-11", "provenance.submitter.author": "", "run_list": [ "a", "b", "c", "d" ], "time_period.end": "2020-07-13T21:46:12+00:00", "time_period.start": "2020-06-02T18:41:43+00:00" } }

Changing Metadata

If you want to change the metadata of any group, be sure to use the write_metadata method. Here’s an example:

cas04.metadata.location.declination.value = -13.5
cas04.write_metadata()
print(cas04.metadata.location.declination)
declination:
	comments = igrf.m by Drew Compston
	model = IGRF-13
	value = -13.5

Have a look at a single channel

Let's pick out a channel and interrogate it. There are a few ways to do this:

  1. Get a channel from its hdf5_reference [demonstrated here]

  2. Get a channel from the mth5_object

  3. Get a station first, then get a channel from it

ex = mth5_object.from_reference(ch_df.iloc[0].hdf5_reference).to_channel_ts()
print(ex)
Channel Summary:
	Survey:       CONUS South
	Station:      CAS04
	Run:          a
	Channel Type: Electric
	Component:    ex
	Sample Rate:  1.0
	Start:        2020-06-02T19:00:00+00:00
	End:          2020-06-02T22:07:46+00:00
	N Samples:    11267
ex.channel_metadata
{ "electric": { "ac.end": 0.0, "ac.start": 0.0, "channel_number": 0, "comments": "run_ids: [c,b,a]", "component": "ex", "contact_resistance.end": 0.0, "contact_resistance.start": 0.0, "data_quality.rating.value": null, "dc.end": 0.0, "dc.start": 0.0, "dipole_length": 92.0, "filters": [ { "applied_filter": { "applied": true, "comments": "practical to SI unit conversion", "name": "electric_si_units", "stage": 1 } }, { "applied_filter": { "applied": true, "comments": "electric dipole for electric field", "name": "electric_dipole_92.000", "stage": 2 } }, { "applied_filter": { "applied": true, "comments": "NIMS electric field 5 pole Butterworth 0.5 low pass (analog)", "name": "electric_butterworth_low_pass", "stage": 3 } }, { "applied_filter": { "applied": true, "comments": "NIMS electric field 1 pole Butterworth high pass (analog)", "name": "electric_butterworth_high_pass_30000", "stage": 4 } }, { "applied_filter": { "applied": true, "comments": "analog to digital conversion (electric)", "name": "electric_analog_to_digital", "stage": 5 } }, { "applied_filter": { "applied": true, "comments": "time offset in seconds (digital)", "name": "electric_time_offset", "stage": 6 } } ], "measurement_azimuth": 13.2, "measurement_tilt": 0.0, "negative.datum": "WGS 84", "negative.elevation": 335.3, "negative.elevation_uncertainty": 0.0, "negative.id": "200406D", "negative.latitude": 37.633351, "negative.latitude_uncertainty": 0.0, "negative.longitude": -121.468382, "negative.longitude_uncertainty": 0.0, "negative.manufacturer": "Oregon State University", "negative.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type", "negative.type": "electrode", "negative.x": 0.0, "negative.x2": 0.0, "negative.x_uncertainty": 0.0, "negative.y": 0.0, "negative.y2": 0.0, "negative.y_uncertainty": 0.0, "negative.z": 0.0, "negative.z2": 0.0, "negative.z_uncertainty": 0.0, "positive.datum": "WGS 84", "positive.elevation": 335.3, "positive.elevation_uncertainty": 0.0, "positive.id": "200406B", 
"positive.latitude": 37.633351, "positive.latitude_uncertainty": 0.0, "positive.longitude": -121.468382, "positive.longitude_uncertainty": 0.0, "positive.manufacturer": "Oregon State University", "positive.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type", "positive.type": "electrode", "positive.x": 0.0, "positive.x2": 0.0, "positive.x_uncertainty": 0.0, "positive.y": 0.0, "positive.y2": 0.0, "positive.y_uncertainty": 0.0, "positive.z": 0.0, "positive.z2": 0.0, "positive.z_uncertainty": 0.0, "sample_rate": 1.0, "time_period.end": "2020-06-02T22:07:46+00:00", "time_period.start": "2020-06-02T19:00:00+00:00", "type": "electric", "units": "digital counts" } }

Calibrate time series data

Most data loggers output data in digital counts. A series of filters representing the various instrument responses is then applied to get the data into physical units, after which the data can be analyzed and processed. Commonly this is done during the processing step, but it is important to be able to look at time series data in physical units. The ChannelTS object provides a remove_instrument_response method for this. Here's an example:

print(ex.channel_response)
ex.channel_response.plot_response(np.logspace(-4, 1, 50))
Filters Included:
=========================
{'name': 'electric_si_units', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'coefficient', 'units_in': 'milliVolt per kilometer', 'units_out': 'Volt per meter', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 1e-06, 'sequence_number': 1, '_class_name': 'coefficient_filter', 'total_gain': 1e-06}
--------------------
{'name': 'electric_dipole_92.000', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'coefficient', 'units_in': 'Volt per meter', 'units_out': 'Volt', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 92.0, 'sequence_number': 2, '_class_name': 'coefficient_filter', 'total_gain': 92.0}
--------------------
{'name': 'electric_butterworth_low_pass', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'zpk', 'units_in': 'Volt', 'units_out': 'Volt', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 1.0, 'sequence_number': 3, 'poles': array([ -3.883009+11.951875j,  -3.883009-11.951875j,
       -10.166194 +7.386513j, -10.166194 -7.386513j,
       -12.566371 +0.j      ]), 'zeros': array([], dtype=complex128), 'normalization_factor': 313383.493219835, '_class_name': 'pole_zero_filter', 'total_gain': 313383.493219835}
--------------------
{'name': 'electric_butterworth_high_pass_30000', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'zpk', 'units_in': 'Volt', 'units_out': 'Volt', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 1.0, 'sequence_number': 4, 'poles': array([-3.3e-05+0.j]), 'zeros': array([0.+0.j]), 'normalization_factor': 1.00000000015128, '_class_name': 'pole_zero_filter', 'total_gain': 1.00000000015128}
--------------------
{'name': 'electric_analog_to_digital', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'coefficient', 'units_in': 'Volt', 'units_out': 'digital counts', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 409600000.0, 'sequence_number': 5, '_class_name': 'coefficient_filter', 'total_gain': 409600000.0}
--------------------
{'name': 'electric_time_offset', 'comments': {'author': 'None', 'time_stamp': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'value': None, '_class_name': 'comment'}, 'type': 'time delay', 'units_in': 'digital counts', 'units_out': 'digital counts', 'calibration_date': {'time_stamp': '1980-01-01T00:00:00+00:00', 'gps_time': False}, 'gain': 1.0, 'sequence_number': 6, 'delay': -0.285, '_class_name': 'time_delay_filter', 'total_gain': 1.0}
--------------------

2026-01-09T21:18:03.831241-0800 | WARNING | mt_metadata.timeseries.filters.channel_response | complex_response | line: 372 | Filters list not provided, building list assuming all are applied
C:\Users\peaco\OneDrive\Documents\GitHub\mt_metadata\mt_metadata\timeseries\filters\plotting_helpers.py:246: UserWarning: Legend does not support handles for list instances.
A proxy artist may be used instead.
See: https://matplotlib.org/stable/users/explain/axes/legend_guide.html#controlling-the-legend-entries
  fig.legend(
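
As a back-of-the-envelope check, the coefficient-stage gains in the filter list above multiply through the cascade; the pole-zero stages have unit pass-band gain, so this sketch gives only the low-frequency scale factor (the gain values are taken from the printed filters, the sample count value is made up):

```python
# Coefficient-stage gains from the printed filter list.
si_units_gain = 1e-06        # milliVolt/kilometer -> Volt/meter
dipole_gain = 92.0           # Volt/meter -> Volt (92 m dipole)
adc_gain = 409600000.0       # Volt -> digital counts

# Cascaded DC gain: counts per (mV/km).
total_gain = si_units_gain * dipole_gain * adc_gain

# Going from raw counts back to mV/km divides by the cascaded gain.
counts = 75366.4
mv_per_km = counts / total_gain
```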
ex.remove_instrument_response(plot=True)
C:\Users\peaco\OneDrive\Documents\GitHub\mth5\mth5\timeseries\ts_filters.py:548: UserWarning: Attempt to set non-positive xlim on a log-scaled axis will be ignored.
  ax2.set_xlim((f[0], f[-1]))
Channel Summary:
	Survey:       CONUS South
	Station:      CAS04
	Run:          a
	Channel Type: Electric
	Component:    ex
	Sample Rate:  1.0
	Start:        2020-06-02T19:00:00+00:00
	End:          2020-06-02T22:07:46+00:00
	N Samples:    11267

Have a look at a run

Let's pick out a run, take a slice of it, and interrogate it. There are a few ways to do this:

  1. Get a run from its run_hdf5_reference [demonstrated here]

  2. Get a run from the mth5_object

  3. Get a station first, then get a run from it

run_from_reference = mth5_object.from_reference(ch_df.iloc[0].run_hdf5_reference).to_runts(start=ch_df.iloc[0].start.isoformat(), n_samples=360)
print(run_from_reference)
2026-01-09T21:18:41.053177-0800 | WARNING | mth5.timeseries.run_ts | validate_metadata | line: 1045 | end time of dataset 2020-06-02T19:05:59+00:00 does not match metadata end 2020-06-02T22:07:46+00:00 updating metatdata value to 2020-06-02T19:05:59+00:00
RunTS Summary:
	Survey:      CONUS South
	Station:     CAS04
	Run:         a
	Start:       2020-06-02T19:00:00+00:00
	End:         2020-06-02T19:05:59+00:00
	Sample Rate: 1.0
	Components:  ['ex', 'ey', 'hx', 'hy', 'hz']
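
The printed window end follows from the slice parameters: with sample_rate samples per second, the last of n_samples falls (n_samples - 1) / sample_rate seconds after the start. A quick sanity check of the 360-sample slice above:

```python
from datetime import datetime, timedelta

sample_rate = 1.0  # samples per second
n_samples = 360

start = datetime.fromisoformat("2020-06-02T19:00:00+00:00")
# The last sample sits (n_samples - 1) sample intervals after the first.
end = start + timedelta(seconds=(n_samples - 1) / sample_rate)
```

This reproduces the 2020-06-02T19:05:59+00:00 end time shown in the RunTS summary.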
run_plot = run_from_reference.plot()

Calibrate Run

calibrated_run = run_from_reference.calibrate()
calibrated_run_plot = calibrated_run.plot()

Load Transfer Functions

You can download the transfer functions for CAS04 and NVR08 from IRIS SPUD EMTF. This has already been done, in EMTF XML format, and the file is loaded here.

cas04_tf = r"USMTArray.CAS04.2020.xml"
from mt_metadata.transfer_functions.core import TF
for tf_fn in [cas04_tf]:
    tf_obj = TF(tf_fn)
    tf_obj.read()
    mth5_object.add_transfer_function(tf_obj)
2023-12-15T15:46:03.198158-0800 | WARNING | mt_metadata.transfer_functions.io.emtfxml.metadata.helpers | _read_element | No declination in EMTF XML

Have a look at the transfer function summary

mth5_object.tf_summary.summarize()
tf_df = mth5_object.tf_summary.to_dataframe()
tf_df

Plot the transfer functions using MTpy

Note: This currently works on the mtpy branch v2_plots.

from mtpy import MTCollection
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[27], line 1
----> 1 from mtpy import MTCollection

ModuleNotFoundError: No module named 'mtpy'
mc = MTCollection()
mc.open_collection(r"8P_CAS04_NVR08")
pmr = mc.plot_mt_response(["CAS04", "NVR08"], plot_style="1")

Plot Station locations

Here we can plot station locations for all stations in the file, or we can give it a bounding box. If you have internet access a basemap will be plotted using Contextily.

st = mc.plot_stations(pad=.9, fig_num=5, fig_size=[6, 4])
st.fig.get_axes()[0].set_xlim((-121.9, -117.75))
st.fig.get_axes()[0].set_ylim((37.35, 38.5))
st.update_plot()
mth5_object.close_mth5()
2026-01-09T21:19:19.816936-0800 | INFO | mth5.mth5 | close_mth5 | line: 772 | Flushing and closing c:\Users\peaco\OneDrive\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5