optimap.motion¶

Motion estimation and compensation functions for optical mapping data.

The function motion_compensate() is the main top-level function; it combines all steps of the motion compensation pipeline for optical mapping data.

See Christoph and Luther [1] and Lebert et al. [2] for details.

optimap.motion.motion_compensate(video, contrast_kernel=7, ref_frame=0, presmooth_temporal=0.8, presmooth_spatial=0.8, postsmooth=None, method=None)[source]¶

Typical motion compensation pipeline for an optical mapping video.

First, the video is smoothed in space and time with a Gaussian kernel. Then, local contrast is enhanced to remove the fluorescence signal. Finally, optical flow is estimated between every frame and a reference frame, and the video is warped to the reference frame using the estimated flow.

See motion.contrast_enhancement() and motion.estimate_displacements() for further details.

Parameters:
  • video (np.ndarray) – Video to estimate optical flow for (list of images or 3D array {t, x, y}). Can be any dtype because contrast enhancement will convert it to float32.

  • contrast_kernel (int, optional) – Kernel size for local contrast enhancement (must be odd), by default 7. See contrast_enhancement() for details.

  • ref_frame (int, optional) – Index of reference frame to estimate optical flow to, by default 0.

  • presmooth_temporal (float, optional) – Standard deviation of smoothing Gaussian kernel in time, by default 0.8. See optimap.video.smooth_spatiotemporal() for details.

  • presmooth_spatial (float, optional) – Standard deviation of smoothing Gaussian kernel in space, by default 0.8. See optimap.video.smooth_spatiotemporal() for details.

  • postsmooth (tuple, optional) – Tuple of (wx, wy, wt) for Gaussian smoothing kernel in space and time, by default None. If None, no smoothing is applied. See smooth_displacements() for details.

  • show_progress (bool, optional) – Show progress bar, by default None

  • method (str, optional) – Optical flow method to use, by default None which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise

Returns:

Motion-compensated video of shape {t, x, y}

Return type:

np.ndarray
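The principle behind motion compensation can be illustrated on synthetic data. The sketch below is a toy example, not optimap's implementation: FFT phase correlation recovers only rigid integer translations, whereas motion_compensate() estimates dense non-rigid flow with the Farnebäck method. It only shows the core idea of warping every frame back onto a reference frame.

```python
import numpy as np

# Toy sketch of the motion-compensation idea (NOT optimap's Farneback
# pipeline): each frame is a rigidly shifted copy of the reference frame;
# we estimate the per-frame shift by FFT phase correlation and undo it.

rng = np.random.default_rng(0)
ref = rng.random((64, 64)).astype(np.float32)  # reference frame (ref_frame=0)

true_shifts = [(0, 0), (3, -2), (-5, 4)]
video = np.stack([np.roll(ref, s, axis=(0, 1)) for s in true_shifts])

def estimate_shift(frame, ref):
    """Integer translation between frame and ref via FFT phase correlation."""
    cps = np.fft.fft2(frame) * np.conj(np.fft.fft2(ref))
    cps /= np.abs(cps) + 1e-12           # normalized cross-power spectrum
    corr = np.fft.ifft2(cps).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative shifts (circular FFT).
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

compensated = np.stack([
    np.roll(frame, tuple(-d for d in estimate_shift(frame, ref)), axis=(0, 1))
    for frame in video
])
print(np.allclose(compensated, ref))  # prints True: all frames warped onto ref
```

In the real pipeline the per-frame shift is replaced by a dense displacement field of shape {t, x, y, 2}, and the warping is done by warp_video().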

optimap.motion.reverse_motion_compensate(video_tracking, video_warping, contrast_kernel=7, ref_frame=0, presmooth_temporal=0.8, presmooth_spatial=0.8, postsmooth=None, method=None)[source]¶

Typical motion tracking pipeline to transform a video back into motion. For example, after motion-compensating a recording and extracting the fluorescence wave dynamics, this function transforms the processed, motion-less video back into motion so that it can be overlaid on top of the original video with video.play_with_overlay().

See motion_compensate() and estimate_reverse_displacements() for explanation of parameters and further details.

Parameters:
  • video_tracking (np.ndarray) – Video to estimate optical flow for (list of images or 3D array {t, x, y}). Can be any dtype.

  • video_warping (np.ndarray) – Video to be warped based on the motion of the video_tracking data.

  • contrast_kernel (int, optional) – Kernel size for local contrast enhancement (must be odd), by default 7. See contrast_enhancement() for details.

  • ref_frame (int, optional) – Index of reference frame to estimate optical flow to, by default 0.

  • presmooth_temporal (float, optional) – Standard deviation of smoothing Gaussian kernel in time, by default 0.8. See optimap.video.smooth_spatiotemporal() for details.

  • presmooth_spatial (float, optional) – Standard deviation of smoothing Gaussian kernel in space, by default 0.8. See optimap.video.smooth_spatiotemporal() for details.

  • postsmooth (tuple, optional) – Tuple of (wx, wy, wt) for Gaussian smoothing kernel in space and time, by default None. If None, no smoothing is applied. See smooth_displacements() for details.

  • show_progress (bool, optional) – Show progress bar, by default None

  • method (str, optional) – Optical flow method to use, by default None which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise

Return type:

np.ndarray

optimap.motion.estimate_displacements(video, ref_frame=0, show_progress=True, method=None)[source]¶

Calculate optical flow between every frame of a video and a reference frame. Wrapper around FlowEstimator for convenience. See FlowEstimator.estimate() for details.

Parameters:
  • video (np.ndarray) – Video to estimate optical flow for (list of images or 3D array {t, x, y})

  • ref_frame (int, optional) – Index of reference frame to estimate optical flow to, by default 0.

  • show_progress (bool, optional) – Show progress bar, by default True.

  • method (str, optional) – Optical flow method to use, by default None, which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise.

Returns:

Optical flow array of shape {t, x, y, 2}

Return type:

np.ndarray

optimap.motion.estimate_reverse_displacements(video, ref_frame=0, show_progress=True, method=None)[source]¶

Calculate optical flow between every frame of a video and a reference frame. Wrapper around FlowEstimator for convenience. See FlowEstimator.estimate_reverse() for details.

Parameters:
  • video (np.ndarray) – Video to estimate optical flow for (list of images or 3D array {t, x, y})

  • ref_frame (int, optional) – Index of reference frame to estimate optical flow to, by default 0.

  • show_progress (bool, optional) – Show progress bar, by default True.

  • method (str, optional) – Optical flow method to use, by default None which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise

Returns:

Optical flow array of shape {t, x, y, 2}

Return type:

np.ndarray

optimap.motion.warp_video(video: ndarray, displacements: ndarray, show_progress=False, interpolation=2, borderMode=4, borderValue=0.0)[source]¶

Warps a video according to the given optical flow. Uses GPU if available.

Parameters:
  • video ({t, x, y} ndarray or list of images) – video to warp

  • displacements ({t, x, y, 2} ndarray or list of ndarrays) – optical flow fields

  • show_progress (bool, optional) – show progress bar

  • interpolation (int, optional) – interpolation method (see cv2.remap), by default cv2.INTER_CUBIC

  • borderMode (int, optional) – border mode (see cv2.remap), by default cv2.BORDER_REFLECT101

  • borderValue (float, optional) – border value (see cv2.remap), by default 0.0

Returns:

{t, x, y} ndarray

Return type:

np.ndarray

optimap.motion.warp_image(img: ndarray, displacement: ndarray, interpolation=2, borderMode=4, borderValue=0.0)[source]¶

Warps an image according to the given optical flow. Uses GPU if available.

Parameters:
  • img ({x, y} ndarray) – image to warp

  • displacement ({x, y, 2} ndarray) – optical flow field

  • interpolation (int, optional) – interpolation method (see cv2.remap), by default cv2.INTER_CUBIC

  • borderMode (int, optional) – border mode (see cv2.remap), by default cv2.BORDER_REFLECT101

  • borderValue (float, optional) – border value (see cv2.remap), by default 0.0

Returns:

{x, y} ndarray

Return type:

np.ndarray
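The sampling rule behind a remap-based warp can be sketched with plain numpy. The example below is a simplified nearest-neighbor stand-in, not the library function: warp_image() uses cv2.remap with bicubic interpolation and reflective borders. The component layout of the displacement field (index 0 as x, index 1 as y) is an assumption here; check the library for the actual convention.

```python
import numpy as np

# Minimal nearest-neighbor sketch of a remap-style warp: each output pixel
# (y, x) samples the input at (y + dy, x + dx), i.e. map = base_grid +
# displacement. The real warp_image() uses bicubic interpolation and
# reflective border handling instead of the clipping used here.

def warp_image_nn(img, displacement):
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Assumed layout: displacement[..., 0] is the x-component,
    # displacement[..., 1] the y-component.
    sx = np.clip(np.rint(xs + displacement[..., 0]).astype(int), 0, w - 1)
    sy = np.clip(np.rint(ys + displacement[..., 1]).astype(int), 0, h - 1)
    return img[sy, sx]

img = np.arange(16, dtype=np.float32).reshape(4, 4)
zero = np.zeros((4, 4, 2), dtype=np.float32)
assert np.array_equal(warp_image_nn(img, zero), img)  # zero flow = identity

shift = zero.copy()
shift[..., 0] = 1.0                  # every pixel samples one pixel to the right
print(warp_image_nn(img, shift)[0])  # first row becomes [1. 2. 3. 3.]
```

With a zero displacement field the warp is the identity, which is a quick sanity check when debugging flow conventions.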

optimap.motion.contrast_enhancement(video_or_img: ndarray, kernel_size: int, mask: ndarray = None)[source]¶

Amplifies local contrast to its maximum to remove the fluorescence signal for motion estimation. See Christoph and Luther [2018] for details.

Parameters:
  • video_or_img (np.ndarray) – {t, x, y} or {x, y} ndarray

  • kernel_size (int) – Kernel size for local contrast enhancement (must be odd)

  • mask (np.ndarray, optional) – Mask of valid values, of shape {x, y} or {t, x, y}, by default None

Returns:

{t, x, y} or {x, y} ndarray of dtype np.float32

Return type:

np.ndarray
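Conceptually, local contrast enhancement can be sketched as a sliding-window min-max normalization: each pixel is rescaled by the minimum and maximum of its k x k neighborhood, so slowly varying fluorescence intensity drops out while the tissue texture needed for motion tracking is preserved. The plain-Python loop below is a readable sketch of that idea, not the library's (vectorized) implementation, and its edge handling is an assumption.

```python
import numpy as np

# Sketch of local contrast enhancement as sliding-window min-max
# normalization over a k x k neighborhood (edge-padded). Slow intensity
# changes are removed; local texture is stretched to the full [0, 1] range.

def contrast_enhance(img, k):
    assert k % 2 == 1, "kernel size must be odd"
    r = k // 2
    padded = np.pad(img.astype(np.float32), r, mode="edge")
    out = np.zeros_like(img, dtype=np.float32)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = padded[y:y + k, x:x + k]   # k x k neighborhood of (y, x)
            lo, hi = win.min(), win.max()
            out[y, x] = (img[y, x] - lo) / (hi - lo) if hi > lo else 0.0
    return out

# A smooth vertical gradient: after enhancement every pixel sits at the
# extreme (or middle) of its local window, independent of absolute intensity.
img = np.outer(np.linspace(0, 1, 8), np.ones(8)).astype(np.float32)
enhanced = contrast_enhance(img, 3)
print(enhanced.min(), enhanced.max())  # spans the full [0, 1] range
```

This also shows why the function returns float32 regardless of the input dtype: the normalized values live in [0, 1].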

optimap.motion.smooth_displacements(displacements, wx: int, wy: int, wt: int, mask: ndarray = array([[0.]], dtype=float32))[source]¶

Smooths optical flow fields in space and time using a Gaussian kernel.

Parameters:
  • displacements (np.ndarray) – {t, x, y, 2} optical flow array

  • wx (int) – kernel size in x-direction

  • wy (int) – kernel size in y-direction

  • wt (int) – kernel size in t-direction

  • mask (np.ndarray, optional) – {x, y} mask to apply to the flow field before smoothing, by default None

Returns:

{t, x, y, 2} smoothed optical flow array

Return type:

np.ndarray
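Spatiotemporal Gaussian smoothing of a {t, x, y, 2} flow array is separable: one 1D convolution per axis, applied to each flow component independently. The numpy sketch below illustrates this under stated assumptions; the width-to-sigma mapping and the reflective boundary handling are choices made for the example, not necessarily those of smooth_displacements().

```python
import numpy as np

# Separable Gaussian smoothing of a {t, x, y, 2} displacement array:
# one 1D convolution per axis (t, x, y); the component axis is untouched.
# The kernel widths wt, wx, wy mirror smooth_displacements()'s parameters.

def gaussian_kernel(w):
    sigma = w / 4.0                       # assumed width-to-sigma mapping
    x = np.arange(w) - (w - 1) / 2.0
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()                    # normalize so smoothing preserves mean

def smooth_flow(flow, wx, wy, wt):
    out = flow.astype(np.float64)
    for axis, w in ((0, wt), (1, wx), (2, wy)):
        if w <= 1:
            continue
        k = gaussian_kernel(w)
        pad = [(0, 0)] * out.ndim
        pad[axis] = (w // 2, w - 1 - w // 2)
        padded = np.pad(out, pad, mode="reflect")
        out = np.apply_along_axis(
            lambda v: np.convolve(v, k, mode="valid"), axis, padded)
    return out

flow = np.zeros((5, 8, 8, 2))
flow[2, 4, 4, 0] = 1.0                    # single noisy spike in the x-component
smoothed = smooth_flow(flow, 3, 3, 3)
assert smoothed[2, 4, 4, 0] < 1.0         # spike spread over its neighbors
assert np.isclose(smoothed.sum(), 1.0)    # total displacement mass preserved
```

Smoothing displacement fields this way suppresses frame-to-frame jitter in the estimated flow without altering the overall motion.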

optimap.motion.play_displacements(video, vectors, skip_frame=1, vskip=5, vcolor='red', vscale=1.0, title='video')[source]¶

Simple video player with displacement vectors displayed as arrows.

Parameters:
  • video ({t, x, y} ndarray) – video to play

  • vectors ({t, x, y, 2} ndarray) – Optical flow / displacement vectors.

  • skip_frame (int) – only show every n-th frame

  • vskip (int) – show only every n-th vector

  • vcolor (str) – color of the vectors

  • vscale (float) – scales the displayed vectors

  • title (str) – title of the plot

Return type:

matplotlib.animation.FuncAnimation

optimap.motion.play_displacements_points(video, vectors, skip_frame=1, vskip=10, vcolor='black', psize=5, title='')[source]¶

Simple video player with displacement vectors displayed as points.

Parameters:
  • video ({t, x, y} ndarray) – video to play

  • vectors ({t, x, y, 2} ndarray) – optical flow / displacement vectors.

  • skip_frame (int) – only show every n-th frame

  • vskip (int) – show only every n-th vector

  • vcolor (str) – color of the vectors, default: black

  • psize (int) – size of the points, default: 5

  • title (str) – title of the plot

Return type:

matplotlib.animation.FuncAnimation

class optimap.motion.FlowEstimator[source]¶

Bases: object

Optical flow estimator class which wraps OpenCV's optical flow methods. Supports both CPU and GPU methods (if CUDA is available). See OpenCV with CUDA support for installation instructions.

See [Lebert et al., 2022] for a comparison and discussion of the different optical flow methods for optical mapping data. We recommend and default to the Farnebäck method.

Supported optical flow methods: Farnebäck (default; 'farneback' on GPU, 'farneback_cpu' on CPU), TV-L1, Lucas-Kanade, and Brox; see the corresponding parameter attributes below.

The functions estimate_displacements() and estimate_reverse_displacements() are convenience wrappers around this class.

Warning

This class expects images with values in the range [0,1]. Except for the 'brox' method, all images are internally converted to uint8 before calculating the optical flow. This is because the OpenCV CUDA optical flow methods only support uint8 images. This may lead to unexpected results if the images are not in the range [0,1].
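Given this warning, it is worth normalizing input data to [0, 1] before estimating flow. The helper below is a hypothetical convenience (not part of optimap) showing one simple way to do that with a global min-max rescale.

```python
import numpy as np

# Hypothetical helper (not part of optimap): rescale a video globally to
# [0, 1] so the estimator's internal uint8 conversion does not clip values.

def normalize_01(video):
    video = np.asarray(video, dtype=np.float32)
    lo, hi = video.min(), video.max()
    if hi <= lo:
        return np.zeros_like(video)      # constant video: nothing to normalize
    return (video - lo) / (hi - lo)

# e.g. raw 12-bit camera data in [0, 4095]
raw = np.random.default_rng(1).integers(0, 4096, size=(4, 8, 8))
norm = normalize_01(raw)
assert norm.min() == 0.0 and norm.max() == 1.0
```

A per-frame normalization would also work but changes the relative brightness between frames, which can bias the flow estimate; a single global rescale avoids that.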

farneback_params = {}¶

parameters for Farneback optical flow

tvl1_params = {}¶

parameters for TVL1 optical flow

lk_params = {}¶

parameters for LK optical flow

brox_params = {}¶

parameters for Brox optical flow

cuda_supported¶

whether CUDA is supported

default_method¶

default optical flow method

estimate(imgs: ndarray | list, ref_img: ndarray, method: str = None, show_progress: bool = True)[source]¶

Estimate optical flow between every frame of a video and a reference frame. The returned optical flow is an array of 2D flow fields which can be used to warp the video to the reference frame (motion compensation).

Parameters:
  • imgs (Union[np.ndarray, list]) – Video to estimate optical flow for (list of grayscale images or 3D array {t, x, y})

  • ref_img (np.ndarray) – Reference image to estimate optical flow to (grayscale image {x, y})

  • method (str, optional) – Optical flow method to use, by default None which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise

  • show_progress (bool, optional) – show progress bar, by default True

Returns:

optical flow array of shape {t, x, y, 2}

Return type:

np.ndarray

estimate_reverse(imgs: ndarray | list, ref_img: ndarray, method: str = None, show_progress: bool = True)[source]¶

Estimate optical flow between a reference frame and every frame of a video. The returned optical flow is an array of 2D flow fields which can be used to warp the video to the reference frame (motion compensation).

Parameters:
  • imgs (Union[np.ndarray, list]) – Video to estimate optical flow for (list of grayscale images or 3D array {t, x, y})

  • ref_img (np.ndarray) – Reference image to estimate optical flow to (grayscale image {x, y})

  • method (str, optional) – Optical flow method to use, by default None which means 'farneback' if a CUDA GPU is available, or 'farneback_cpu' otherwise

  • show_progress (bool, optional) – show progress bar, by default True

Returns:

optical flow array of shape {t, x, y, 2}

Return type:

np.ndarray

estimate_imgpairs(img_pairs, method: str = None, show_progress: bool = True)[source]¶

Estimate optical flow for a list of image pairs.

Parameters:
  • img_pairs (list of tuples of np.ndarray) – List of grayscale image pairs to estimate optical flow for

  • method (str, optional) – Optical flow method to use, by default None

  • show_progress (bool, optional) – show progress bar, by default True

Returns:

optical flow array of shape {N, x, y, 2}

Return type:

np.ndarray