Tip
Download this tutorial as a Jupyter notebook, or as a Python script with code cells. We highly recommend using Visual Studio Code to execute this tutorial.
Tutorial 6: Activation Maps and Beat Detection
This tutorial demonstrates how to compute and visualize activation maps from cardiac optical mapping data using optimap. Activation maps display local activation times (LATs), which indicate when the tissue becomes electrically activated. These maps are crucial for understanding cardiac electrical activity and identifying abnormalities.
Here, we will use example data from Rybashlykov et al. [2022] in which a planar action potential wave propagates across the ventricle of a mouse heart.
import optimap as om
import monochrome as mc
import numpy as np
import matplotlib.pyplot as plt
filename = om.download_example_data("doi:10.5281/zenodo.5557829/mouse_41_120ms_control_iDS.mat")
video = om.load_video(filename)
metadata = om.load_metadata(filename)
print(f"Loaded video with shape {video.shape} and metadata {metadata}")
frequency = metadata["frequency"]
Downloading data from 'doi:10.5281/zenodo.5557829/mouse_41_120ms_control_iDS.mat' to file '/home/runner/work/optimap/optimap/docs/tutorials/optimap_example_data/mouse_41_120ms_control_iDS.mat'.
Loaded video with shape (1954, 200, 200) and metadata {'acqFreq': 977.0013873419699, 'dual': 0, 'frequency': 977.0013873419699}
The mouse_41_120ms_control_iDS.mat file from the Zenodo dataset shows induced pacing beats in a mouse heart. The load_metadata() function loads the metadata from the MATLAB file, in this case the acquisition frame rate. We visualize the video using Monochrome:
# Show video
mc.show(video, name=filename.name, metadata=metadata)
We use optimap’s background_mask() function to create a mask of the heart:
# remove background by masking
mask = om.background_mask(video[0], show=False)
mask = om.image.dilate_mask(mask, iterations=5, show=False)
om.image.show_mask(mask, video[0], title="Background mask");
Creating mask with detected threshold 36.5

Generating an activation map (also known as an isochrone map) is straightforward with optimap:
video_smoothed = om.video.smooth_spatiotemporal(video, sigma_temporal=1, sigma_spatial=2)
video_smoothed[:, mask] = np.nan
activation_map = om.compute_activation_map(video_smoothed[10:30],
                                           falling_edge=True,
                                           fps=frequency)

Here we did a few things:
- We smoothed the video in space and time using a Gaussian filter.
- We applied the background mask to the smoothed video.
- We computed the activation map over the first pacing beat in the video (frames 10 to 30).
The activation map is a map of the local activation times (LATs) of the cardiac tissue. We will now go through the steps and options for computing activation times and visualizing activation maps in more detail.
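For orientation, here is a minimal sketch (not part of the original pipeline) of how you might inspect the resulting map directly; it assumes the activation_map variable from the cell above:
# Minimal sketch: the activation map is a plain 2D NumPy array with one LAT
# per pixel (in frames, see below); masked background pixels are NaN
print(activation_map.shape)       # (200, 200), same shape as a single frame
print(np.nanmin(activation_map))  # earliest local activation time (frames)
print(np.nanmax(activation_map))  # latest local activation time (frames)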
Detecting Action Potentials
Computing activation maps is highly sensitive to noise. Looking at the optical trace of a single pixel at position (100, 100), we can see why the smoothing is necessary to discern the inverted action potential of the voltage-sensitive dye:
om.compare_traces([video, video_smoothed],
                  [(100, 100)],
                  size=1,
                  labels=["Original Video", "Smoothed"],
                  title="Optical trace of one pixel",
                  fps=frequency)

Note that the action potentials are inverted, i.e. the depolarization appears as a negative peak (negative signal polarity). This is expected, as the mouse heart was stained with the voltage-sensitive fluorescent dye Di-4-ANEPPS.
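As a small optional sketch, you could also flip the polarity of a trace yourself; in the rest of this tutorial we instead handle the inversion by passing falling_edge=True or by plotting 1 - signal:
# Optional sketch: invert a single-pixel trace so that depolarization appears
# as a positive deflection (illustration only, not used below)
trace_inverted = -om.extract_traces(video_smoothed, (100, 100))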
Let’s normalize the smoothed video to the range [0, 1] so that we can compare optical traces from different positions. We will use a pixel-wise sliding-window normalization here, but could have also used a regular pixel-wise normalization or a normalization of a frame-wise difference-filtered video, see Tutorial 3.
video_norm = om.video.normalize_pixelwise_slidingwindow(video_smoothed, window_size=200)
video_norm[:, mask] = np.nan
Let’s visualize it:
om.show_video_pair(video, video_norm, title1="original video",
                   title2="normalized video", interval=10)
Or using Monochrome:
mc.show(video, name="original video")
mc.show_layer(1-video_norm, name="normalized video", cmap="blackbody", opacity='linear')
We can use the find_activations() function to find pacing beats. Because the activation wave is so fast, we can just average the signal over the whole heart to identify when the pacing beat occurs:
activations = om.find_activations(1 - video_norm, fps=frequency, method="threshold_crossing")
print(f"Found {len(activations)} activation events at frames: {activations}")

Found 14 activation events at frames: [ 21 140 257 382 608 727 845 971 1196 1315 1432 1665 1784 1902]
The function returns a list of activation times in frames and plots the detected activations with red lines. We plotted 1 - signal here to show the normal action potential shape.
The local activation time (LAT) is the time point at which a specific region of cardiac tissue depolarizes and becomes ‘electrically activated’. Two main methods exist to find action potential depolarizations and compute LATs (a toy illustration of both follows below):
- Maximum derivative: The LAT is the time point where the rate of change of the fluorescence signal (its first derivative) is at its maximum. This corresponds to the steepest part of the upstroke. Maximum \(dV/dt\) is less sensitive to baseline drift and signal amplitude variations than other methods. See optimap.activation.find_activations_dvdt() for more details.
- Threshold crossing: The LAT is the time point when the (normalized) fluorescence signal crosses a predefined threshold during the upstroke. See optimap.activation.find_activations_threshold() for more details.
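To make the two criteria concrete, here is a toy example on a synthetic, normalized upstroke; it uses plain NumPy and is independent of optimap (on real data you would use find_activations() as shown below):
# Toy example: the two LAT criteria on a synthetic sigmoidal upstroke
frames = np.arange(50)
signal = 1 / (1 + np.exp(-(frames - 20)))  # normalized upstroke around frame 20

lat_dvdt = np.argmax(np.diff(signal))      # maximum derivative: steepest rise
lat_thresh = np.argmax(signal >= 0.5)      # threshold crossing: first frame >= 0.5
print(lat_dvdt, lat_thresh)                # both close to frame 20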
The function find_activations() combines both methods; use the method argument to select the one you want. The default is method='maximum_derivative'; here we have used method='threshold_crossing' with the default threshold of 0.5.
To get a more accurate estimate of when the first activation occurs for each pacing beat, we can manually select a pixel close to the pacing site and run the detection again. We will use the maximum derivative method in this case; both work:
# Select a pixel position close to the pacing site to find activation events
#
# traces, coords = om.select_traces(video_norm[:500], size=10, ref_frame=video[0])
# trace = om.extract_traces(video_norm, coords[0], size=10)
# activations = om.activation.find_activations(1 - trace)
# Here hardcoded position (141, 100) for demo purposes
fig, axs = plt.subplot_mosaic('ABB', figsize=(10, 2))
om.show_positions([(141, 100)], video[0], ax=axs['A'])
trace = om.extract_traces(video_smoothed, (141, 100), size=10)
pacing_events = om.activation.find_activations(1 - trace, ax=axs['B'], fps=frequency)
plt.show()
print(f"Found {len(pacing_events)} activation events at frames: {pacing_events}")

Found 14 activation events at frames: [ 14 132 249 374 601 719 837 966 1188 1307 1424 1658 1777 1894]
Notice that we didn’t need to use the normalized signal here. The threshold_crossing method requires a normalized signal to work properly, while the maximum_derivative method works well with the raw signal.
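For example, a short sketch running each method on the kind of input it expects, reusing trace and video_norm from the cells above:
# Sketch: maximum_derivative on the raw (smoothed) trace ...
acts_dvdt = om.activation.find_activations(1 - trace, method="maximum_derivative", show=False)
# ... and threshold_crossing on the normalized [0, 1] trace
trace_norm = om.extract_traces(video_norm, (141, 100), size=10)
acts_thresh = om.activation.find_activations(1 - trace_norm, method="threshold_crossing", show=False)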
Let’s visualize the wave propagating across the ventricles for the first pacing beat:
figure, axs = plt.subplots(1, 7, figsize=(10, 3))
om.show_image(video[0], ax=axs[0], title='original')
for i in range(0, 6):
    time = i*2 * (1000/frequency)  # convert frames to ms
    om.show_image(video_norm[pacing_events[0] + i*2], title=f"{time:.1f} ms", ax=axs[i+1])
plt.show()

Computing Local Activation Times
Let’s plot some of the optical traces (manually selected so that they show locations which are activated one after the other):
# Positions selected with the GUI:
# positions = om.select_positions(video[0])
positions = [(134, 101), (14, 93), (94, 99), (53, 97)]

fig, axs = plt.subplots(1, 2, figsize=(10, 5))
om.trace.show_positions(positions, video[0], ax=axs[0])
traces = om.extract_traces(video_norm,
                           positions,
                           fps=frequency,
                           ax=axs[1])
axs[1].axhline(y=0.5, color='r', linestyle='dashed', label='threshold')
axs[1].text(0.03, 0.52, 'threshold', color='r')
plt.xlim(0, 0.12)
plt.show()

Here we have marked the threshold at 0.5 which is used for the threshold_crossing method. When we compare the LATs between the two methods, we can see that both work well, with slight differences of one frame.
Note that the function find_activations() returns the frame index closest to the detection condition. This means that if the threshold is crossed between two frames for the threshold_crossing method, the function returns the index of the frame that is closest to the threshold crossing.
fig, axs = plt.subplots(1, 2, figsize=(10, 4.5))
colors = ['blue', 'orange', 'green', 'red']
xlim = (130, 146)

for ax, method in zip(axs, ['threshold_crossing', 'maximum_derivative']):
    om.show_traces(traces, ax=ax, colors=colors, linestyle='solid', marker='.')
    activations = om.activation.find_activations(traces, method=method, falling_edge=True, show=False)
    for i in range(len(activations)):
        for activation in activations[i]:
            if xlim[0] <= activation <= xlim[1]:
                ax.axvline(activation, linestyle='--', color=colors[i], alpha=0.6)
                ax.text(activation, 0.35, f'LAT: {activation:.1f}',
                        rotation=90, va='top', ha='right', color=colors[i], fontsize=10)
    if method == 'threshold_crossing':
        ax.axhline(y=0.5, color='r', linestyle='dashed')
        ax.text(xlim[0] + 1, 0.51, 'threshold', color='r')
    ax.set_xlim(*xlim)
    ax.grid()
    ax.set_title(f"{method=}")

fig.suptitle("Detected local activation times (LATs)")
plt.show()

Computing Activation Maps
We can now compute an activation map by identifying, in each pixel, the local activation time at which the action potential wave front passes through that pixel. optimap’s compute_activation_map() function automatically computes a two-dimensional activation map which shows the local activation times in every pixel:
t = pacing_events[2]
activation_map = om.compute_activation_map(
    video_norm[t - 3:t + 16],
    falling_edge=True,
    fps=frequency,
    show_contours=True
)

compute_activation_map() uses find_activations() internally and supports both LAT detection methods. We again used the argument falling_edge=True due to the negative polarity of the signal (\(- \Delta F / F\)). If we had manually inverted the video beforehand, or if we were working with calcium imaging data, this would not be necessary.
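As a sketch of that alternative (commented out, mirroring the call above), the video could be inverted once so that the upstroke becomes a rising edge and falling_edge can be omitted:
# Sketch of the equivalent call on an inverted video (not executed here):
#
# video_inverted = 1 - video_norm
# activation_map = om.compute_activation_map(video_inverted[t - 3:t + 16],
#                                            fps=frequency, show_contours=True)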
Because our mask was automatically generated, we have a lot of pixels which are not part of the ventricles. Let’s improve the mask manually to create better activation maps.
# Refine mask manually with the GUI:
#
# mask = om.interactive_mask(image=video[0], initial_mask=mask)
# om.save_mask('mouse_41_120ms_control_iDS_mask.png', mask)
# Loading the mask from the file for demo purposes
mask_filename = om.download_example_data('mouse_41_120ms_control_iDS_mask.png')
mask = om.load_mask(mask_filename)
video_norm[:, mask] = np.nan
Downloading data from 'https://cardiacvision.ucsf.edu/sites/g/files/tkssra6821/f/optimap-mouse_41_120ms_control_iDS_mask.png_.webm' to file '/home/runner/work/optimap/optimap/docs/tutorials/optimap_example_data/mouse_41_120ms_control_iDS_mask.png'.
Now let’s run it again. The activation map below shows contour lines, which are a powerful visualization tool and help highlight the wavefront propagation. Contour lines are not shown by default, but can be added by setting show_contours=True in the compute_activation_map() or show_activation_map() functions. Here we also define custom contour levels of our choice:
activation_map = om.compute_activation_map(
    video_norm[t - 3:t + 17],
    falling_edge=True,
    fps=frequency,
    show_contours=True,
    contour_levels=[3, 6, 9, 12],
    vmax=13,
)

The activation map is a 2D array with the LAT value, in number of frames, for each pixel. Both compute_activation_map() and find_activations() return results in units of frames; the plotting functions activation.show_activations() and show_activation_map() convert them to milliseconds using the fps frame rate parameter.
If normalize_time is set to True (the default), the minimum LAT is subtracted from all LAT values, so that the first activation time is at frame 0.
om.print_properties(activation_map)
------------------------------------------------------------------------------------------
array with dimensions: (200, 200)
datatype of array: float32
minimum value in entire array: 0.0
maximum value in entire array: 14.0
number of NaNs in array: 18061
------------------------------------------------------------------------------------------
In this example the latest local activation time (LAT) is at frame 14.
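If you need millisecond values for your own post-processing, the conversion is a one-line sketch (the plotting functions do this automatically when fps is passed):
# Sketch: convert LATs from frames to milliseconds by hand
activation_map_ms = activation_map * 1000.0 / frequency  # 14 frames ≈ 14.3 ms at 977 fps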
Let’s compare the two LAT detection methods:
fig, axs = plt.subplots(1, 2, figsize=(10, 4))
for ax, method in zip(axs, ['maximum_derivative', 'threshold_crossing']):
    om.compute_activation_map(
        video_norm[t - 3:t + 17],
        falling_edge=True,
        fps=frequency,
        method=method,
        show_contours=True,
        contour_levels=[3, 6, 9, 12],
        vmax=13,
        ax=ax,
        title=f"{method=}",
    )
plt.show()

As we can see, both methods work well and produce similar results.
compute_activation_map() uses find_activations() to calculate the local activation time (LAT) for each pixel. By default they return discrete frame indices. When interpolate=True is set in either function, the LAT is calculated as the exact fraction of a frame using interpolation (right panel):
fig, axs = plt.subplots(1, 2, figsize=(10, 5))
colors = ['blue', 'orange', 'green', 'red']
xlim = (130, 146)

for ax, interpolate in zip(axs, [False, True]):
    om.show_traces(traces, ax=ax, colors=colors, linestyle='solid', marker='.')
    activations = om.activation.find_activations(1 - traces, method='threshold_crossing', interpolate=interpolate, show=False)
    for i in range(len(activations)):
        for activation in activations[i]:
            if xlim[0] <= activation <= xlim[1]:
                ax.axvline(activation, linestyle='--', color=colors[i], alpha=0.6)
                ax.text(activation, 0.45, f'LAT: {activation:.1f}',
                        rotation=90, va='top', ha='right', color=colors[i], fontsize=10)
    ax.axhline(y=0.5, color='r', linestyle='dashed')
    ax.text(xlim[1] - 3, 0.51, 'threshold', color='r')
    ax.set_xlim(*xlim)
    ax.grid()
    ax.set_title("interpolate=True" if interpolate else "interpolate=False (default)")

fig.suptitle("Non-interpolated vs interpolated LATs")
plt.show()

Here is the comparison of the activation maps with and without LAT interpolation:
activation_map_smooth = om.compute_activation_map(
    video_norm[t - 3:t + 17],
    falling_edge=True,
    interpolate=True,
    show=False
)

fig, axs = plt.subplots(1, 3, figsize=(15, 6))
om.show_activation_map(activation_map,
                       ax=axs[0],
                       show_colorbar=False,
                       title='interpolate=False',
                       vmax=15)
om.show_activation_map(activation_map_smooth,
                       ax=axs[1],
                       show_colorbar=False,
                       title='interpolate=True',
                       vmax=15)
om.show_activation_map(activation_map_smooth,
                       ax=axs[2],
                       show_contours=True,
                       contour_levels=[3, 6, 9, 12, 15],
                       contour_fmt=' %.1f ms ',
                       vmax=15,
                       title='with contours')
plt.show()

See show_activation_map() for more options to customize the activation map visualization.
Let’s compare activation maps across subsequent pacing beats:
fig, axs = plt.subplots(4, 3, figsize=(10, 10))
axs = axs.flatten()
for i in range(0, 12):
    t = pacing_events[i]
    om.compute_activation_map(
        video_norm[t - 5:t + 20],
        title=f"Beat {i}",
        falling_edge=True,
        method="threshold_crossing",
        fps=frequency,
        ax=axs[i],
        show_colorbar=False,
        vmax=16
    )
plt.tight_layout()
plt.show()

Let’s look at beat 7 more closely, and overlay the contour lines on top of the raw image. The contour levels are set to 2 ms intervals, but you can adjust them as needed.
Note
You can also combine contour lines with a raw image to visualize the propagation path over the heart tissue:
t = pacing_events[7]
activation_map_7 = om.compute_activation_map(video_norm[t - 4:t + 20], falling_edge=True, interpolate=True, show=False)

fig, ax = plt.subplots(figsize=(8, 6))
# Show the raw camera image
om.show_image(video[t], ax=ax)
# Add a black boundary line around the mask
mask_boundary = ax.contour(~mask, levels=[0], colors='black', linewidths=1.5, alpha=0.8)
# Show contours
om.show_activation_map(
    activation_map_7,
    ax=ax,
    fps=frequency,
    show_map=False,
    show_contours=True,
    contour_levels=range(2, 20, 2),
    contour_fontsize=10,
    contour_args={'linewidths': 1.5, 'alpha': 0.8, 'cmap': 'turbo', 'colors': None})
plt.tight_layout()
plt.show()

We have plotted the activation map using the turbo colormap, but you can choose any colormap you prefer using the cmap argument. vmin and vmax can be used to set the minimum and maximum values for the color scale.
fig, axs = plt.subplots(1, 3, figsize=(10, 4))
om.show_activation_map(activation_map, cmap='jet', show_colorbar=True, title='cmap=jet', ax=axs[0], fps=frequency, colorbar_title="", vmax=16)
om.show_activation_map(activation_map, cmap='magma', show_colorbar=True, title='cmap=magma', ax=axs[1], fps=frequency, colorbar_title="", vmax=16)
om.show_activation_map(activation_map, cmap='twilight_shifted', show_colorbar=True, fps=frequency, title='cmap=twilight_shifted', ax=axs[2], colorbar_title="", vmax=16)
plt.suptitle('Activation maps with different colormaps')
plt.show()

Using matplotlib.pyplot.quiver() we can visualize the propagation direction of the wavefront. The quiver function creates a 2D field of arrows.
# Smooth the activation map and calculate its gradient
activation_map_smooth2 = om.image.smooth_gaussian(activation_map_smooth, sigma=2)
dy, dx = np.gradient(activation_map_smooth2)

# Filter out the 0.1% largest gradient values
gradient_magnitude = np.sqrt(dx**2 + dy**2)
threshold = np.nanpercentile(gradient_magnitude, 99.9)
dx[gradient_magnitude > threshold] = np.nan
dy[gradient_magnitude > threshold] = np.nan

step = 8  # adjust step size for arrow density
shape = activation_map_smooth2.shape
y_indices, x_indices = np.mgrid[:shape[0], :shape[1]]

fig, axs = plt.subplots(1, 2, figsize=(8, 4.5))
ax = axs[0]
om.show_activation_map(activation_map_smooth2, ax=ax, fps=frequency, vmax=15)
ax.quiver(x_indices[::step, ::step],
          y_indices[::step, ::step],
          dx[::step, ::step],
          dy[::step, ::step],
          color='black',
          pivot='mid',
          angles='xy',
          scale=4)
# Add a black boundary line around the mask
mask_boundary = ax.contour(~mask, levels=[0], colors='black', linewidths=1.5, alpha=0.8)

ax = axs[1]
step = 12
om.show_image(video[0], ax=ax)
ax.quiver(x_indices[::step, ::step],
          y_indices[::step, ::step],
          dx[::step, ::step],
          dy[::step, ::step],
          activation_map_smooth2[::step, ::step] / 15,
          width=0.005,
          cmap='turbo',
          pivot='mid',
          angles='xy',
          scale=3)
plt.tight_layout()
plt.show()

Similarly, matplotlib.pyplot.streamplot() can also be used to visualize the propagation direction of the wavefront. The streamplot function creates a 2D field of streamlines, which are lines that follow the flow of the vector field.
# Reuse the smoothed activation map and its gradient (dx, dy) from above
fig, axs = plt.subplots(1, 2, figsize=(8, 4.5))
ax = axs[0]
om.show_activation_map(activation_map_smooth2, ax=ax, fps=frequency, vmax=15)
ax.streamplot(x_indices, y_indices, dx, dy, color='white', linewidth=1)
ax = axs[1]
om.show_image(video[0], ax=ax)
ax.streamplot(x_indices, y_indices, dx, dy, color=activation_map_smooth2, cmap='turbo', linewidth=1, arrowsize=1.5)
fig.tight_layout()
fig.suptitle("Action Potential Wave Propagation Direction using streamplot")
plt.show()

Next Steps
In Tutorial 7 we will continue with the analysis of the activation maps and show how to compute the conduction velocity (CV) of the action potential wavefront.