
Multimodality Imaging Registration: A Case Study Applied to the Thyroid Graves’ Disease

Abstract

The use of different medical imaging modalities is becoming increasingly frequent in medical applications. This is mainly due to the expansion of computational processing techniques, which facilitate not only diagnosis but also follow-up care during medical treatments. Anatomical medical images, acquired by computerized tomography (CT) or magnetic resonance imaging (MRI), can be registered with functional images of the human body obtained by other exams, such as infrared thermography and scintigraphy. This combination of different imaging modalities therefore represents an innovative perspective in medicine. The methodology presented in this paper performs imaging fusion and produces new 2D images that merge different modalities. The acquired image data are aligned based on affine registration, in which at least three corresponding pairs of points are selected in both images. For this purpose, a script was developed in MATLAB®. This processing allows the combination of different pairs of images and also applies transparency to them. The results obtained allow the visualization of the thyroid and other structures of the human anatomy simultaneously in a single aligned image. The focus of this research is to perform imaging fusion of anatomical and functional images. As an example, the proposed endocrinology case study was applied to Graves’ disease, a hyperthyroidism condition in which the thyroid gland shows hyper-uptake of the radiopharmaceuticals used during scintigraphy imaging. Additionally, this image was also blended with the infrared thermography image, among other combinations. Innovative results are presented based on the superposition/overlap of pairs of images, allowing physicians to evaluate both anatomical and functional images. Therefore, this paper presents the investigation of an endocrinological application that improves diagnostic sensitivity.

Keywords:
Thyroid; Image Fusion; Affine Registration; Alignment; Graves’ Disease

HIGHLIGHTS

Proposes a visualization of unified images, helping diagnosis.

Complements treatments in a more reliable way than single images alone can provide.

Visualization of thyroid and other structures at once in a single aligned image.

Evaluates imaging fusion for the inspection of Graves' disease.

INTRODUCTION

The use of medical imaging to define diagnoses and follow the evolution of pathologies is widespread in medicine. The most used modalities are X-rays, Computerized Tomography (CT), Magnetic Resonance Imaging (MRI), Ultrasonography (US), and Nuclear Medicine. These modalities fall into two main categories: anatomical information is obtained through CT or MRI, either as 2D single slices or 3D imaging reconstructions, while functional information is obtained from nuclear medicine, such as scintigraphy. Overall, these imaging modalities can only be evaluated by the medical team, who provide the interpretation [1, 2].

Endocrinology comprises the study of glands and hormones in human beings, and the use of images is an essential tool to enable the visualization of the region of interest (ROI). In this paper, the proposed fusion methodology is demonstrated through medical images of the neck region, specifically those containing the thyroid gland, which has great clinical importance due to the various pathologies that can affect it.

The follow-up care of patients with alterations in this gland is carried out through hormonal tests (TSH, T3, Free T3, T4, and Free T4), which are analyzed through blood samples. However, these laboratory tests have some limitations [3]. In this case, additional exams are used to inspect the thyroid region, and medical imaging approaches are included to evaluate the volume, geometry, and anatomy of the thyroid, as well as its functionality.

The use of superimposed anatomical and functional images allows a joint visualization of anatomical structures, including the size of the thyroid, and provides the location of temperature variations through the identification of nodules or regions with higher hormone production, generating a hyper-uptake condition. This is an advantage that would not be possible with the visualization of single images side by side. It is possible to evaluate the thyroid in its entirety and observe it from different points of view, allowing an accurate diagnosis and specific treatment. The evaluation of the fused images allows joint analysis of both anatomical (obtained by CT) and functional (from infrared and scintigraphy) images. Additionally, from the CT images it is possible to perform a three-dimensional (3D) reconstruction, which frames together both the anatomical and the infrared (IR) thermography images. Therefore, imaging fusion improves sensitivity, facilitates diagnosis, and provides a more detailed and quantitative study of the various anatomical structures, as well as facilitating the analysis of physiological processes [4].

Combining the data obtained from two different imaging modalities into a single image makes it possible to determine the extent of the anatomical area (MRI or CT) and the physiological impairments (infrared image or scintigraphy). Thus, it helps diagnosis with greater precision, including local alterations and their physiological effects. The fusion of medical images, which can be performed through a registration process, superimposes the information available in two different imaging modalities into a single, unique image through an alignment procedure [5-8].

Infrared thermography consists of a painless, fast, low-cost, and radiation-free image acquisition. The technique is based on capturing the infrared radiation produced by the human body itself, with thermal sensors responsible for registering this radiation. From these images, the temperature variations of the tissues in the body region (ROI) are visualized, and abnormalities can be observed [2].

The scintigraphy exam aims to evaluate the functioning of the thyroid, making it possible to identify existing nodules or abnormalities. For this tissue evaluation to be performed, the patient receives a dose of a radiopharmaceutical (e.g., Iodine-131, Iodine-123, or Technetium-99m), which allows high-quality images to be obtained. In addition, it is also possible to diagnose whether hypothyroidism, hyperthyroidism, or other thyroid pathologies have developed and to inspect them over time [9].

The CT images (in DICOM format) are visualized in specific software, where the manual or automatic segmentation and the respective 3D reconstruction of the gland are performed. This process allows the frontal view of the 3D CT geometry of the entire region to be visualized in 3D format [10].

This paper presents a new fusion registration methodology that allows the visualization of the thyroid region with different pairs of medical imaging modalities, namely: infrared thermography; scintigraphy; 2D CT coronal slices; and frontal view 3D CT geometry.

LITERATURE REVIEW

Imaging modalities can be divided according to their physical acquisition methods into anatomical and functional. Anatomical imaging represents the body morphology, and functional imaging represents the body's interaction with different radiopharmaceuticals [5, 8, 11]. Some modalities also use combined information, employing both anatomical and functional data together [12-15].

Combining these modalities to supply information from both systems helps to elucidate different clinical problems. This approach is also known as image registration, which provides further contributions, such as treatment evaluation and disease tracking. It is also particularly helpful when the images alone cannot support conclusive diagnosis and monitoring over time [16-20].

A. Image Registration Methods

Image registration is a fundamental task in image processing, allowing two different images to be mapped and transformed into a common coordinate system so that both can be aligned and visualized together [5, 8, 21]. The transformation is necessary when images are obtained with different imaging sensors, resolutions, acquisition times, or even different positions.

Every registration method represents a set of transformation equations, which transforms the coordinates of one image into the corresponding coordinates of another image [22]. Different coordinate transformations can be applied, such as identity, rigid, affine, and nonrigid. These can also be classified as global or local [5, 8, 11].

The main difference between rigid and nonrigid registration techniques is the nature of the transformation. The goal of rigid registration is to find the six degrees of freedom (three rotations and three translations) of the transformation, as shown in Equation 1 with homogeneous coordinates, which maps any point in the source image into the corresponding point in the target image [23]. For each point (x, y, z) in an image, a rigid mapping can be defined into the coordinates of another space (x', y', z').

\[
T(x,y,z) =
\begin{pmatrix} x' \\ y' \\ z' \\ 1 \end{pmatrix} =
\begin{pmatrix}
\cos\beta\cos\gamma & \cos\alpha\sin\gamma + \sin\alpha\sin\beta\cos\gamma & \sin\alpha\sin\gamma - \cos\alpha\sin\beta\cos\gamma & t_x \\
-\cos\beta\sin\gamma & \cos\alpha\cos\gamma - \sin\alpha\sin\beta\sin\gamma & \sin\alpha\cos\gamma + \cos\alpha\sin\beta\sin\gamma & t_y \\
\sin\beta & -\sin\alpha\cos\beta & \cos\alpha\cos\beta & t_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
\qquad (1)
\]

where α, β, and γ represent rotations around the x, y, and z axes, and $t_x$, $t_y$, and $t_z$ represent translations along the x, y, and z axes.

An extension of this model is the affine transformation model, which has twelve degrees of freedom and also allows scaling and shearing. Any combination of translations, rotations, scaling, and shears can be combined into a single 4×4 3D affine transformation matrix, as shown in Equation 2. Inverse affine transformations are obtained by inverting the transformation matrix.

\[
T(x,y,z) =
\begin{pmatrix} x' \\ y' \\ z' \\ 1 \end{pmatrix} =
\begin{pmatrix}
a_{00} & a_{01} & a_{02} & a_{03} \\
a_{10} & a_{11} & a_{12} & a_{13} \\
a_{20} & a_{21} & a_{22} & a_{23} \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
\qquad (2)
\]
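To make the structure of such a matrix concrete for the 2D case (the images registered in this work are 2D), the minimal MATLAB sketch below assembles an affine matrix from hypothetical rotation, scale, shear, and translation parameters and applies it to one point in homogeneous coordinates; the parameter values are illustrative only.

```matlab
% Minimal 2D illustration of an affine transform (hypothetical parameters).
theta = deg2rad(10);          % rotation angle
s     = [1.2 0.9];            % scaling along x and y
sh    = 0.05;                 % shear factor
t     = [15; -8];             % translation in pixels

R = [cos(theta) -sin(theta); sin(theta) cos(theta)];   % rotation
S = [s(1) 0; 0 s(2)];                                  % scaling
H = [1 sh; 0 1];                                       % shear

A = [R*S*H t; 0 0 1];         % 3x3 affine matrix in homogeneous coordinates

p  = [100; 50; 1];            % source point (x, y, 1)
p2 = A * p;                   % transformed point (x', y', 1)
```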

B. Imaging Fusion

Image fusion consists of joining information obtained from different data sources, while image registration computes the geometric transformation between the two data sets. These techniques are proving to be an important resource for image analysis, especially in the diagnosis of medical images acquired from different modalities. Fusion can be performed through a registration process, which allows the information available in different image modalities to be superimposed into a single image.

The purpose of multimodal image registration and fusion is to provide complementary information from different image modalities [24]; the association of multimodal images increases the sensitivity and reliability of the diagnostic aid and provides a more detailed and quantitative study of the different anatomical structures and their physiology at the same time [4].

In many clinical applications, it is desirable to combine or unify information from different imaging modalities to help in planning, performing, and evaluating surgeries and radiotherapy procedures. Figure 1 shows an example of imaging registration and fusion between MRI and PET images. This case illustrates the importance of imaging registration in avoiding incorrect diagnosis: a melanoma metastasis with a liver lesion was characterized as malignant on the fused PET/MRI images, but had been wrongly classified as benign when only the original MRI was examined (Figure 1 (B)) [25].

Figure 1
Example of image registration/fusion between MRI and PET images. (A) MRI image (T1-weighted) with intravenous gadolinium contrast; and (B) PET image fused with MRI (adapted from [25]).

MATERIAL AND METHODS

In this research, different imaging modalities were used, acquired with the following imaging systems: (1) the infrared images with a FLIR camera model A325, with a resolution of 320 by 240 pixels, a temperature range from -20 to 120 ºC, and a thermal resolution of 0.08 ºC; (2) the scintigraphy with a Discovery 530 system, using a CZT solid-state detector, which enables the use of lower levels of radiopharmaceuticals, with a resolution of 128 by 128 pixels; (3) the anatomical images with a computerized tomography (CT) imaging system, Hitachi Scenaria with 128 channels, whose DICOM slices have a thickness of 1.25 mm, a spacing between slices of 0.468 mm, and an in-plane resolution of 512 by 512 pixels; and (4) additionally, for the 3D reconstruction, the 3D Slicer open-source software was used, based on the original DICOM images obtained from the CT system.

One patient with Graves’ disease was then selected from Hospital das Clínicas (HC-UFPR), Endocrinology Sector (SEMPR). This disorder is an autoimmune disease of the thyroid gland, which is responsible for the overproduction of hormones, mainly related to the thyrotropin receptor antibody (TRAb). Additionally, there is also a growth in the gland's volume, causing hyperthyroidism. In this case study, there is a higher hyper-uptake during the interaction with the radiopharmaceuticals used during scintigraphy. This study was approved by the Research Ethics Committee of the Pontifical Catholic University of Paraná, process number 79980017.0.1001.0020.

Since we are interested in representing an image fusion (overlap) between each pair of imaging modalities - infrared images, scintigraphy, and CT anatomic images (either 2D slices or the 3D geometry representation) - it is necessary to perform the image registration between such image pairs. The registration method employed here is the affine transformation, whose main purpose is to represent both images in the same coordinate system in order to align them [4, 5, 7].

Our methodology has three main algorithm steps, which are detailed below:

STEP 1 - Manual Points Selection

In the first stage, the user manually selects common points/markers in both images in an interactive way. A tool was developed in C++ for marker point selection.

This process works when at least three common pairs of points are selected; the user then evaluates the visualization of the results obtained and decides whether pairs of points need to be removed from or added to both images.
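As an illustration only (the selection tool used in this work was written in C++, and its binary file format is not reproduced here), an equivalent manual pairing of markers can be sketched with MATLAB's Control Point Selection tool; the image file names and the output file below are placeholders:

```matlab
% Read the pair of images to be registered (file names are placeholders).
moving = imread('infrared.png');     % image that will later be transformed
fixed  = imread('ct_frontal.png');   % reference image

% Interactive selection of corresponding point pairs (at least three pairs).
[movingPoints, fixedPoints] = cpselect(moving, fixed, 'Wait', true);

% Store the selected coordinates for the registration step (Step 2).
save('markers.mat', 'movingPoints', 'fixedPoints');
```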

To illustrate, Figure 2 shows the diagram of the point-pair selection, in this case represented by the infrared image and the 2D anatomic CT slice to be jointly registered and visualized after the imaging overlap.

Figure 2
Diagram representing the manual points selection between two different imaging modalities (infrared image and frontal view 3D CT geometry).

STEP 2 - Affine Registration (alignment)

After the point selection, the application stores the point coordinates from both images in a binary file. In order to perform the affine registration, a code was also developed in MATLAB® (The MathWorks, Inc.), based on the Image Processing Toolbox, responsible for reading the file with the selected markers and for performing the actual registration of the pairs of images, enabling visualization of the overlap for further inspection and adjustment if necessary.

In the example of Figure 2, five common pairs of points were selected, although it is possible to include more if needed. The suggestion is to choose points placed on borders or in regions easily identifiable in both images. In this example, the affine transformation was applied to the infrared image; there is therefore also the versatility of choosing which image the transformation is applied to.

Since MATLAB was used for the registration process, the functions cp2tform and imtransform were employed to apply the affine transformation. To choose the best registration process, several tests were also performed in MATLAB, assessing the best outcome. The affine registration was then chosen, as it was better adapted to the image types we are dealing with (anatomic and functional modalities), especially regarding the geometric transformations to be performed on the initial set of images. In terms of the minimum number of control point pairs, it is necessary to select at least three points in each image; in this research, four to six pairs of points were employed.
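A minimal sketch of how this step can be scripted with the two toolbox functions named above is given below; the file and variable names are placeholders, and forcing the output onto the grid of the fixed image is an assumption made so that both images can be overlaid pixel by pixel:

```matlab
% Load the marker coordinates stored in Step 1 (placeholder file name).
load('markers.mat', 'movingPoints', 'fixedPoints');

% Estimate the affine transformation from the selected point pairs
% (at least three pairs are required for the 'affine' type).
tform = cp2tform(movingPoints, fixedPoints, 'affine');

% Apply the transformation to the moving image (here, the infrared image),
% keeping the output on the coordinate grid of the fixed image.
moving = imread('infrared.png');
fixed  = imread('ct_frontal.png');
registered = imtransform(moving, tform, ...
    'XData', [1 size(fixed, 2)], 'YData', [1 size(fixed, 1)]);
```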

STEP 3 - Imaging Fusion & Visualization Overlap

This last step concerns the final fusion visualization and imaging overlap, which relies on applying transparency to the pair of images being merged.

Therefore, the alpha channel (α) was employed, which is a component that represents the degree of opacity (or transparency) applied to the color of each pixel in the merged image (i.e., to the red, green, and blue channels). It is often used to determine how a picture element (pixel) is rendered when blended with another. The blended color can be obtained with Equation 3, applied to each RGB channel of the images $C_1$ and $C_2$.

\[
C = C_1 \cdot (1 - \alpha) + C_2 \cdot \alpha
\qquad (3)
\]

where C, $C_1$, and $C_2$ are RGB colors, and α is a floating-point number ranging from 0 to 1. This is performed for each RGB channel. A value of 0 means that the overlaid pixel is fully transparent, and a value of 1 means that it is fully opaque.
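A minimal sketch of Equation 3 is shown below, assuming the two images have already been registered in Steps 1-2 and saved under the placeholder names used here:

```matlab
% C1 and C2 must be registered RGB images of the same size.
C1 = im2double(imread('ct_frontal.png'));     % anatomical image (placeholder name)
C2 = im2double(imread('registered_ir.png'));  % registered functional image (placeholder name)

alpha = 0.6;                          % 0 = only C1 visible, 1 = only C2 visible
C = C1 .* (1 - alpha) + C2 .* alpha;  % Equation 3, applied to every RGB channel

imshow(C);                            % fused visualization
```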

RESULTS

A. Different Imaging Modalities

This research involves the acquisition of images using four distinct imaging modalities, which are: (1) infrared thermography; (2) scintigraphy; (3) computerized tomography images (CT - showing a coronal slice); and (4) frontal view 3D CT geometry/reconstruction of the CT images (showing the overall external and internal geometry, especially the location/position of the thyroid), as shown in Figure 3.

It is worth mentioning that the first two imaging modalities, thermography and scintigraphy, are functional representations (either in terms of temperature differences or of the interaction of the iodine/technetium radioisotopes with the metabolic functioning of the thyroid gland itself), while the CT-related images are anatomical images.

Figure 3
The different medical imaging modalities employed in this research: (a) infrared thermography; (b) scintigraphy; (c) 2D CT coronal slice image; and (d) frontal view 3D CT geometry.

B. Overlap/Image Registration

For the registration process to work properly, at least three matching common points must be selected. Figure 4 illustrates an image pair, an infrared image (Figure 4(a)) and a CT image slice (Figure 4(b)), showing the manual selection of the common points in both images.

It is worth mentioning, however, that the manual selection is based on finding and selecting points that match similar anatomical regions, since we do not have additional fiducial markers added to the images. The only image to which extra markers could be added would be the infrared image; in the other modalities (scintigraphy and CT), we are not allowed to interfere in the imaging acquisition process. Despite the unavailability of fiducial markers during the CT, since it is not possible to interfere in the CT scan, a registration process was conducted. Thus, the error of this registration process was quantified, obtaining values between 1.41 and 5 pixels, by calculating the Euclidean distance [26] between the markers of both images (as illustrated in Figure 4, showing the selected points).
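One way this error quantification can be scripted is sketched below, assuming the transform estimated in Step 2 and the marker arrays selected in Step 1 are available in the workspace:

```matlab
% Map the moving-image markers with the estimated affine transform,
% then measure the Euclidean distance to the corresponding fixed markers.
mappedPoints = tformfwd(tform, movingPoints);
d = sqrt(sum((mappedPoints - fixedPoints).^2, 2));   % per-marker error in pixels

fprintf('Registration error per marker: min %.2f px, max %.2f px\n', min(d), max(d));
```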

Figure 4
Example of a pair of different imaging modalities, which are being used to perform the registration methodology: (a) Infrared thermography and (b) Frontal view 3D CT geometry. In both cases, it is shown the selection of five common markers (points), used to perform the images registration.

C. Image Visualization Analysis

The next step involves presenting the overlapped images. Figure 5 shows the image merging between the following pairs: (A) scintigraphy vs frontal view 3D CT geometry; (B) scintigraphy vs infrared; (C) scintigraphy vs 2D imaging slice; (D) infrared vs frontal view 3D CT geometry; (E) infrared vs 2D slice; and (F) 3D geometry vs 2D slice.

Figure 5
Results obtained from the overlap between each pair of images, based on the image registration of the following imaging modalities: (a) scintigraphy vs frontal view 3D CT geometry; (b) scintigraphy vs infrared; (c) scintigraphy vs 2D imaging slice; (d) infrared vs frontal view 3D CT geometry; (e) infrared vs 2D slice; and (f) frontal view 3D CT geometry vs 2D slice.

The transparency can be chosen according to the best visualization purposes. Figure 6 illustrates such variations in transparency, allowing the subtle superimposition of both images to be perceived. In this case, two different alpha channel values are represented (α = 0.3 and α = 0.6).

Figure 6
Example of visualization of two different transparencies between the registration of scintigraphy and the frontal view 3D CT geometry, with different alpha channels: (a) α = 0.3 and (b) α = 0.6.

DISCUSSION

The tool presented in this paper allows all the steps of imaging registration and fusion to be performed for the following imaging modalities: infrared, scintigraphy, and CT anatomy - both the 2D slices and the frontal view of the 3D CT geometry model. The proposed method allows anatomical and functional 2D/3D images to be superimposed and visualized together. Thus, the medical team obtains anatomical information (provided by frontal 2D images and 3D reconstruction) to assess the structure/morphology and volume of the gland, while the functional images (scintigraphy and thermography) allow the observation of metabolic characteristics. Therefore, the fusion of these two categories of medical images makes it possible to examine anatomy and function at the same time: the medical team can verify the regions where temperature variations are located, through thermal inspection, while also verifying the regions of hyper-uptake, being able to indicate the presence of nodules by means of scintigraphy.

This imaging registration combines information from two different imaging modalities, generating a hybrid image that allows an interactive visualization. A related approach was shown in the work of Chee and Wu [21], with a self-supervised learning method focused on 3D affine transformation for medical images, aimed specifically at axial views of brain scans. Also, Hattori and Komiyama [11] employed an algorithm to perform a rigid alignment of image frames followed by an affine transformation to correct their distortions.

The validation of such an imaging registration process, however, is not trivial, since there is no gold-standard method for obtaining images similar to the ones obtained here. Additionally, the functional images (infrared and scintigraphy) have very low spatial resolution, which is a further limitation. Even the literature presents uncertainties about the quantification of registration errors between different imaging modalities, as presented by Fitzpatrick and coauthors [27], apart from Viergever and coauthors [8]. For example, Goshtasby [12] presents applications of image registration regarding performance metrics, which, according to him, have to consider accuracy, reliability, and speed. As additional examples regarding medical applications, Parsai and coauthors [25] presented several cases in which imaging registration between MRI and PET enhanced diagnosis, and Kainz and coauthors [28] presented an application related to the fusion of the lacrimal gland from CT/MRI (anatomical images) with scintigraphy (functional images). Therefore, every imaging registration process has inherent errors, which are related to the mathematical transformations performed [5, 8, 26, 29]. In our case study, a registration error ranging from 1.41 to 5 pixels was obtained, which for this scenario provides a reasonable result for the related endocrinology medical applications. There are obvious limitations, such as the unavailability of fiducial markers, the low spatial resolution of the functional images (infrared and scintigraphy), and the different imaging modalities/systems, which lead to different anatomical positions. All of these should be taken into account in further research.

In this paper, as the inclusion of fiducial markers was not permitted, it was not possible to conduct a full accuracy evaluation. However, an error quantification related to the registration process was performed. Fiducial markers in tomography cannot interfere with the exam. Also, there is no way to use markers in scintigraphy, because in this exam it is only possible to visualize the thyroid gland and its functional parameters (hyper-uptake); there is no contour image of the patient's outer bust and head, which means that markers placed on the skin of the volunteer would not be visualized. Comparing the accuracy asserted for one method with that of another often proves challenging due to inherent methodological discrepancies. According to Bankman [29], and similarly to our research, in the absence of established gold standards, verifying true accuracy with this level of certainty presents a real challenge, although error quantification is still possible.

While thermography measures the infrared radiation emitted by the thyroid during its physiological and metabolic processes, scintigraphy can identify whether nodules behave as hot or cold. When hot focal nodules are present in the patient's gland, a high uptake of iodine can be observed [30]. Okosieme and coauthors [31] demonstrate that patients with high uptake seem to be more resistant to the radiopharmaceutical used in the exam. It is important to note that when there are hyperactive nodules, the region has increased chemical activity and blood flow [32].

In the literature, some papers also demonstrate how the use of IR thermography to generate functional images of the gland is important for endocrinologists, especially in cases of different abnormalities (e.g., hyperthyroidism, thyroid cancer, nodules). For example, Helmy and coauthors [32] worked on thyroid infrared image acquisition and processing in order to extract useful features for assisting artificial intelligence algorithms in cancer diagnosis, which were later compared with scintigraphy images (using Iodine-123).

Also, for hyperthyroidism patterns, another study combined numerical simulations with experimental measurements to support a better understanding of the physiological aspects of the gland. The basis of the method was 3D Finite Element Modeling (FEM) built from magnetic resonance images, which was compared with the IR images [33]. The literature has demonstrated that IR thermography images can be used as an alternative for a safe and reliable diagnosis in cases of hyperthyroidism [34].

These references point to technological advances in medicine that help to understand thyroid gland physiology in greater depth, as in a previous registration performed by Souza and coauthors [1] for a different thyroid disease. Techniques such as imaging registration and fusion involving different categories of medical images (anatomical from MRI and CT; IR thermography with temperature changes; scintigraphy) help specialists to assess the patient's thyroid dysfunctions from different angles, improving treatment tracking.

In this research, the focus is on evaluating Graves’ disease, which already presents higher hyper-uptake of the radiopharmaceuticals used in scintigraphy imaging [9]. Additionally, another finding of this paper is related to the common features among all the imaging modalities (scintigraphy, infrared, and CT anatomy - slice and frontal view 3D CT geometry), since a perfect match was found across the image registrations obtained here (Figure 5), especially between the higher temperatures in the infrared image and the higher uptake in the scintigraphy.

Our work presents the fusion of anatomical images obtained by CT with two functional imaging modalities (thermography and scintigraphy). Thus, it is possible to observe functional information along with the anatomical characteristics, enabling the identification of nodules through temperature variation and/or increased hyper-uptake. For this reason, both endocrinologists and patients will benefit from a safer medical opinion and greater sensitivity, both for diagnosis and for monitoring the evolution and treatment of the disease.

CONCLUSION

The methodology presented here performs medical imaging registration and fusion, providing multimodal images and resulting in new 2D images that combine information from different imaging modalities into a single new image. Therefore, it supports medical diagnosis and inspection with complementary information, in an interactive way, beyond what the single images provide: infrared images, scintigraphy, 2D CT slices, and 3D CT geometry. To the best of our knowledge, we have not found similar studies in the literature applying registration methods focused on employing both anatomical and functional medical images acquired with different imaging systems, using different anatomic positions for image acquisition of the patient, and not simultaneously. Every registration and image fusion has an inherent error, no matter how robust and accurate the systems are. Additionally, no fiducial markers could be placed, and the functional images provide low spatial resolution. However, even with these limitations, it was possible to generate reasonable results by applying the registration between pairs of images from different modalities. Therefore, this work opens up further exploration in registration methods.

The case study presented in this work involved a volunteer with Graves' disease; the results obtained with the different imaging modalities made it possible to analyze and inspect the disorder, providing strong evidence for future clinical evaluations.

We emphasize that the development of new medical imaging technologies is crucial for the diagnosis and follow-up of the patient. Such an approach makes this care more effective, reducing the uncertainty of the medical team and bringing more safety to the patient during treatment. Therefore, the use of a combination of different imaging modalities demonstrates how technology and medicine are great allies in the treatment of the most diverse diseases.

Acknowledgments

We are thankful to the endocrinologists, Dra Gisah Amaral de Carvalho, Dr Cléo Otaviano Mesa Junior and Dra Isabelle Gasparetto Leite, for all the support with the patients during this research.

REFERENCES

  • 1
    De Souza MA, Bueno AP, Magas V, Nogueira Neto GN, Nohama P. Imaging Fusion Between Anatomical and Infrared Thermography of Thyroid Gland. GMEPE/PAHCE 2019; 1-5.
  • 2
    Ring EF, Ammer K. Infrared thermal imaging in medicine. Physiol Meas. 2012; 33: 33-46.
  • 3
    Dufour DR. Laboratory Tests of Thyroid Function: Uses and Limitations. Endocrinol Metab Clin North Am. 2007; 36(3): 579-94.
  • 4
    Sanches IJ. [Superposition of thermography and magnetic resonance images: a new tridimensional medical image modality] [thesis]. Curitiba: Universidade Tecnológica Federal do Paraná, Programa de Pós-Graduação em Engenharia Elétrica e Informática Industrial; 2009. 170 p. Brazilian Portuguese.
  • 5
    Maintz JBA, Viergever MA. A Survey of Medical Image Registration. Med Image Anal. 1998; 2(1): 1-37
  • 6
    Modersitzki J. Numerical Methods for Image Registration. Oxford University Press Inc. New York, USA, 2004
  • 7
    Zitová B, Flusser J. Image Registration Methods: A Survey. Image Vis. Comput. 2003; 21: 977-1000
  • 8
    Viergever MA, Maintz JB, Klein S, Murphy K, Staring M, Pluim JP. A survey of medical image registration - under review. Med Image Anal. 2016; 33: 140-4.
  • 9
    Günay O, Sarihan M, Yarar O, Akkurt İ, Demir M. Measurement of Radiation Dose in Thyroid Scintigraphy. Acta Phys. Pol. A. 2020; 137(4): 569-73.
  • 10
    Chang C, Chung P, Hong Y, Tseng C. A Neural Network for Thyroid Segmentation and Volume Estimation in CT Images. IEEE Comput Intell Mag. 2011;6(4):43-55.
  • 11
    Hattori R, Komiyama T. PatchWarp: Corrections of non-uniform image distortions in two-photon calcium imaging data by patchwork affine transformations. Cell Rep Methods. 2022; 2(5):100205.
  • 12
    Goshtasby AA. Theory and applications of image registration. 1st ed. Wiley; 2017. 484 p.
  • 13
    Azam MA, Khan KB, Ahmad M, Mazzara M. Multimodal Medical Image Registration and Fusion for Quality Enhancement. CMC. 2021; 68: 821-40.
  • 14
    Shi Y, Jiang X, Li S. Fusion algorithm of UAV infrared image and visible image registration. Soft Comput. 2021.
  • 15
    Li D, Chen L, Bao W, Sun J, Ding B, Li Z. An improved image registration and fusion algorithm. Wireless Netw. 2021;27:3597-611.
  • 16
    Haskins G, Kruger U, Yan P. Deep learning in medical image registration: a survey. Mach Vis Appl. 2020; 31(8): 1-18.
  • 17
    González JR, Rodrigues ÉO, Damião CP, Fontes CAP, Silva AC, Paiva AC, Li H, Du C, Conci A. An Approach for Thyroid Nodule Analysis Using Thermographic Images. Series in BioEngineering Application of Infrared to Biomedical Sciences. 2017; Chapter 26: 451-75.
  • 18
    Brown, LG. A survey of image registration techniques. ACM Comput Surv. 1992; 24(4): 325-376.
  • 19
    Shusharina N, Heinrich MP, Huang R. Segmentation, Classification, and Registration of Multi-modality Medical Imaging Data. MICCAI 2020 Challenges, ABCs 2020, L2R 2020, TN-SCUI 2020, Held in Conjunction with MICCAI 2020, Lima, Peru, October 4-8, 2020. Proceedings de Lecture Notes in Computer Science Image Processing, Computer Vision, Pattern Recognition, and Graphics, Springer Nature, 2021; Vol. 12587.
  • 20
    González JR, Pupo Y, Hernandez M, Conci A, Machenry T, Fiirst W. On Image Registration for Study of Thyroid Disorders by Infrared Exams. Proceedings of the International Conference on Image Processing, Computer Vision, and Pattern Recognition; Athens, 2018.
  • 21
    Chee E, Wu Z. AIRNet: Self-Supervised Affine Registration for 3D Medical Images using Neural Networks. ArXiv.2018.
  • 22
    Van Den Elsen PA, Pol E-JD, Viergever MA. Medical Image Matching - A Review with Classification. IEEE Eng Med Biol. 1993; 12(1): 26-39.
  • 23
    Rueckert D. Nonrigid Registration: Concepts, Algorithms, and Applications, In: Medical Image Registration. CRC Press. 2001. 281-302
  • 24
    Chen YT, Wang MS. Three-Dimensional Reconstruction and Fusion for Multi-Modality Spinal Images. Comput. Med. Imaging Graph. 2004; 28: 21-31.
  • 25
    Parsai A, Miquel ME, Jan H, Kastler A, Szyszko T, Zerizer I. Improving liver lesion characterization using retrospective fusion of FDG PET/CT and MRI. Clin imaging. 2019; 55: 23-28.
  • 26
    Gonzalez RC, Woods RE. Digital Image Processing. 3rd ed. New Jersey: Pearson Prentice Hall; 2008. 976 p.
  • 27
    Fitzpatrick JM. Detecting Failure, Assessing Success. Medical Image Registration (1st ed.). CRC Press. Boca Raton, Florida, USA, 2001.
  • 28
    Kainz H, Bale R, Donnemiller E, Gabriel M, Kovacs P, Decristoforo C, Moncayo R. Image fusion analysis of 99m Tc-HYNIC-octreotide scintigraphy and CT/MRI in patients with thyroid-associated orbitopathy: the importance of the lacrimal gland. Eur J Nucl Med Mol Imaging.2003; 30(8): 1155-9.
  • 29
    Bankman IN. Handbook of Medical Imaging, Processing and Analysis. Academic Press. 2000. 901 p.
  • 30
    Meller J, Becker W. The continuing importance of thyroid scintigraphy in the era of high-resolution ultrasound. Eur J Nucl Med Mol Imaging. 2002; 29(2): S425-38.
  • 31
    Okosieme OE, Chan D, Price SA, Lazarus JH, Premawardhana LD. The utility of radioiodine uptake and thyroid scintigraphy in the diagnosis and management of hyperthyroidism. Clin Endocrinol (Oxf). 2010; 72(1): 122-7.
  • 32
    Helmy A, Holdmann M, Rizkalla. Application of thermography for non-invasive diagnosis of thyroid gland disease. IEEE Trans Biomed Eng. 2008; 55: 1168-75.
  • 33
    Jin C, He ZZ, Yang Y, Liu J. MRI-based three-dimensional thermal physiological characterization of thyroid gland of human body. Med Eng Phys. 2014; 36: 16-25.
  • 34
    Rossato M, Burei M, Vettor R. Neck thermography in the differentiation between diffuse toxic goiter during methimazole treatment and normal thyroid. Endocrine. 2015; 48(3): 1016-7.
  • Funding:

    This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), from Brazil, grant number 432038/2016-7.

Edited by

Editor-in-Chief:

Paulo Vitor Farago

Associate Editor:

Paulo Vitor Farago

Publication Dates

  • Publication in this collection
    29 Apr 2024
  • Date of issue
    2024

History

  • Received
    20 Sept 2022
  • Accepted
    08 Nov 2023