Tools for Processing Anatomical Data

CIVET

The CIVET project was initiated to create an automated, easy-to-use human brain-imaging pipeline built on state-of-the-art software tools developed by researchers at the BIC. It provides fully automated processing and analysis of large MR data sets, including the extraction and analysis of cortical surfaces from MR images, as well as many other volumetric and corticometric functions.
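
As a rough sketch of how the pipeline is usually launched (the directory names, prefix, subject ID, and N3 distance below are placeholders, and option names and defaults differ between CIVET releases, so check the documentation of your installation):

    # minimal sketch of a CIVET run; paths, prefix, and subject ID are placeholders,
    # and CIVET usually expects inputs named <prefix>_<id>_t1.mnc in the source directory
    CIVET_Processing_Pipeline \
        -sourcedir /data/raw \
        -targetdir /data/civet_out \
        -prefix study \
        -N3-distance 200 \
        -run subject001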


N3

The N3 package, part of the MINC tools, implements a non-parametric method for correcting intensity non-uniformity in MRI data. Non-uniformity correction is typically an essential first step in any processing sequence.
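
As a minimal sketch, N3 is applied with the nu_correct command from the package; the file names below are placeholders and the spline distance is a scanner-dependent choice:

    # apply N3 non-uniformity correction to a native T1 volume (placeholder file names)
    nu_correct native_t1.mnc native_t1_nuc.mnc

    # the -distance option controls the spline smoothing distance (mm);
    # values smaller than the default are sometimes used for higher-field data
    nu_correct -distance 100 native_t1.mnc native_t1_nuc.mnc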


Registration tools/ANIMAL

This family of algorithms linearly and nonlinearly registers two images to each other; all of them are part of the MNI AutoReg package. The most commonly used, mritotal, registers an MRI to standard Talairach space. ANIMAL, also part of MNI AutoReg, was designed to label the major anatomical regions (the lobes, corpus callosum, etc.) of an MRI.
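
As an illustrative sketch (file names are placeholders, and the stereotaxic model used by mritotal depends on what is installed), mritotal produces a transform file that is then applied with mincresample:

    # estimate the native-to-Talairach linear transform (placeholder file names)
    mritotal subject_t1.mnc subject_to_tal.xfm

    # resample the native volume onto the template grid using that transform;
    # model_template.mnc stands in for whichever stereotaxic model is installed
    mincresample -transformation subject_to_tal.xfm \
        -like model_template.mnc subject_t1.mnc subject_t1_tal.mnc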

Registration of MINC volumes

There are numerous scripts that automatically register one MINC volume to another, either linearly or nonlinearly.

  • Linear Registration

mritotal, mritoself, bestlinreg.pl, and other MNI AutoReg tools, with additional documentation on Wikibooks.

  • Nonlinear Registration

bestnonlinreg.pl is the most commonly used tool; a sketch of typical linear and nonlinear invocations appears after this list.

  • Monkey versions

monkey_best_lin_reg.pl and monkey_mritotal.pl are the scripts used to linearly and nonlinearly register monkey MRIs to a monkey atlas such as the MNI Macaque Atlas.
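
As a sketch of typical invocations (file names are placeholders; argument order and options should be checked against each script's -help, and the monkey scripts follow the same pattern against a macaque model):

    # linear 12-parameter registration of a source volume to a target (placeholder names)
    bestlinreg.pl source_t1.mnc target_t1.mnc lin.xfm

    # nonlinear registration, usually run once the volumes are already linearly aligned
    bestnonlinreg.pl source_t1_lin.mnc target_t1.mnc nlin.xfm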


INSECT

INSECT is an algorithm that classifies a structural MRI into its three main tissue types: white matter, gray matter, and CSF. It is available as part of the classify package in the packages.bic.mni.mcgill.ca/ directory.
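
As a hedged sketch of a typical invocation through the classify_clean front end of that package (file names are placeholders; the expected preprocessing, e.g. stereotaxic registration and N3 correction, and the available options depend on the installed version):

    # classify a stereotaxic, non-uniformity-corrected T1 volume into CSF / grey / white labels
    # (placeholder file names; see classify_clean -help for version-specific options)
    classify_clean subject_t1_tal.mnc subject_classified.mnc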


BEaST

BEaST is a robust brain extraction method based on a multi-resolution patch-based framework.

Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to produce consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer’s Disease Neuroimaging Initiative databases.

In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors.

Read more: downloads and tutorial.
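
As an illustrative sketch of a typical run (the library path and file names are placeholders, flag names may differ between BEaST releases, and the input is normally expected to be intensity-normalized and resampled into the space of the library priors first):

    # extract a brain mask with the mincbeast front end of BEaST;
    # $BEAST_LIBRARY stands in for the installed library of priors (placeholder)
    mincbeast -fill -median "$BEAST_LIBRARY" subject_t1_tal.mnc subject_brain_mask.mnc

    # the mask can then be applied to the T1 volume, e.g. with minccalc
    minccalc -expression 'A[0]*A[1]' subject_t1_tal.mnc subject_brain_mask.mnc subject_brain.mnc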


CANDLE

CANDLE (Collaborative Approach for eNhanced Denoising under Low-light Excitation) is a method for the processing of 3D laser scanning multiphoton microscopy images.

A new Collaborative Approach for eNhanced Denoising under Low-light Excitation (CANDLE) is introduced for the processing of 3D laser scanning multiphoton microscopy images. CANDLE is designed to be robust for low signal-to-noise ratio (SNR) conditions typically encountered when imaging deep in scattering biological specimens. Based on an optimized non-local means filter involving the comparison of filtered patches, CANDLE locally adapts the amount of smoothing in order to deal with the noise inhomogeneity inherent to laser scanning fluorescence microscopy images. An extensive validation on synthetic data, images acquired on microspheres and in vivo images is presented. These experiments show that the CANDLE filter obtained competitive results compared to a state-of-the-art method and a locally adaptive optimized nonlocal means filter, especially under low SNR conditions (PSNR<8dB). Finally, the deeper imaging capabilities enabled by the proposed filter are demonstrated on deep tissue in vivo images of neurons and fine axonal processes in the Xenopus tadpole brain.

Reference

P. Coupé, M. Munz, J. V. Manjon, E. Ruthazer, D. L. Collins. A CANDLE for a deeper in-vivo insight. Medical Image Analysis, 2012.

Software Download

The Matlab software with documentation and example images is available for download here: https://sites.google.com/site/pierrickcoupe/softwares/denoising-for-medical-imaging/multiphoton-filtering


SEAL

SEAL stands for “Sulcal Extraction and Labelling”, which describes its purpose: it extracts cortical sulci from MR images and labels them.
