
International Journal of Open Source Software and Processes, 5(3), 39-57, July-September 2014

A Freehand 3D Ultrasound Imaging System using Open-Source Software Tools with Improved Edge-Preserving Interpolation
Mohammad I. Daoud, Department of Computer Engineering, German Jordanian University, Amman, Jordan
Abdel-Latif Alshalalfah, Department of Computer Engineering, German Jordanian University, Amman, Jordan
Falah Awwad, Electrical Engineering Department, United Arab Emirates University, Al-Ain, UAE
Mahasen Al-Najar, Department of Diagnostic Radiology, The University of Jordan Hospital, Amman, Jordan

ABSTRACT
Ultrasound imaging is widely employed in various medical procedures. Most ultrasound procedures are
performed with conventional 2D ultrasound systems, but visualizing the 3D anatomy using 2D ultrasound
images is often challenging. This paper describes the use of open-source software tools to develop a freehand
system for synthesizing high-quality 3D ultrasound volumes using electromagnetic tracking. In the proposed
system, the spatial transformation between the 2D ultrasound images and the electromagnetic sensor attached
to the ultrasound transducer was estimated using an accurate spatial calibration method. A new interpolation
method, called the edge-preserving distance-weighted (EPDW) method, is employed to reconstruct the 3D ultrasound
volumes. The performance of the system is evaluated using a set of phantom experiments. The results
showed that the reconstructed 3D ultrasound volumes have sub-millimeter accuracy. Moreover, the ultrasound
volumes synthesized using the EPDW method demonstrated improved edge preservation compared with a
previous interpolation method.

Keywords: Freehand Ultrasound, Graphics Processing Unit, ParaView Software, Scilab Software Package, Three-Dimensional Ultrasound, Visualization Toolkit Library, Volume Reconstruction

DOI: 10.4018/IJOSSP.2014070103

Copyright © 2014, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.

INTRODUCTION
Ultrasound imaging is commonly used in various diagnostic and interventional medical procedures, such as
cancer screening (Liao et al., 2011) and image-guided intervention (Daoud et al., 2011; Unsgard,
2009). Compared with other imaging modalities, such as computed tomography (CT) and magnetic
resonance imaging (MRI), ultrasound offers the advantages of low cost, noninvasiveness,
portability, and real-time imaging capability. The majority of conventional ultrasound imaging
systems are equipped with two-dimensional (2D) transducers that cannot provide three-dimensional
(3D) volume data of the tissue. Using 2D ultrasound imaging, the user is required to perform
mental analysis and integration of several 2D ultrasound images to obtain a 3D impression
of the scanned anatomy. Moreover, conventional 2D ultrasound systems do not support the
acquisition of 2D images parallel to the skin. To overcome these limitations, 3D ultrasound
imaging systems have been developed to enable 3D volume data acquisition (Wen et al., 2013;
Scheipers et al., 2010).
Several methods have been proposed for enabling 3D ultrasound imaging (Bax et al., 2008;
Wen et al., 2013; Oralkan et al., 2003). In general, these methods can be classified into three groups
(Fenster et al., 2011). In the first group, ultrasound volume imaging of the tissue is performed
using a 3D transducer that is composed of a 2D array of transducer elements. The transducer
elements are controlled to electronically steer the ultrasound beam across a volume of interest
(VOI) to acquire a 3D ultrasound volume in real time (Oralkan et al., 2003). In general, these
3D ultrasound systems involve high costs and are not commonly available.
In the second group of 3D ultrasound imaging systems, a motorized mechanical apparatus
is employed to rotate, translate, or tilt a 2D ultrasound transducer along a predefined path to
acquire a sequence of 2D ultrasound images across a VOI. This motorized motion of the 2D
transducer is precisely controlled, and hence the 3D position and orientation of each acquired
2D ultrasound image can be determined with high accuracy. A 3D ultrasound volume can be
reconstructed by processing the acquired 2D ultrasound images along with their 3D positions
and orientations (Bax et al., 2008). One limitation of these 3D ultrasound imaging systems is
the size of the reconstructed ultrasound volume, which is restricted by the motion range of the
motorized mechanical apparatus. In addition, the ultrasound probe employed in these 3D systems
integrates both the 2D ultrasound transducer and the motorized mechanical apparatus, and hence
such a probe is considered bulky and its use in some clinical procedures might not be convenient.
The last group of 3D ultrasound imaging systems is based on freehand ultrasound imaging.
In this approach, the capability of a conventional 2D ultrasound imaging system is expanded
to enable the reconstruction of 3D ultrasound volumes (Wen et al., 2013). In particular, the 3D
motion of the conventional 2D ultrasound transducer is monitored using a tracking system.
The operator freely moves the conventional 2D ultrasound transducer in an unrestricted manner
over the target anatomy to acquire a sequence of 2D ultrasound images. The 3D position and
orientation of each acquired 2D ultrasound image are computed based on the tracked motion
of the transducer. Finally, a 3D ultrasound volume is reconstructed by processing the acquired
2D ultrasound images along with their computed 3D positions and orientations.
Freehand ultrasound is considered a cost-effective technique for 3D ultrasound imaging.
Moreover, freehand ultrasound offers several advantages, including the flexibility of visualizing
the anatomy from different views and the ability to increase the density of the acquired ultrasound
images in important areas (Gooding et al., 2008). Due to these advantages, freehand ultrasound
has gained increasing popularity in both diagnostic and therapeutic procedures.
This paper describes the use of open-source software to develop a freehand 3D ultrasound
imaging system for synthesizing high-quality 3D ultrasound volumes. In this freehand ultrasound
system, an electromagnetic sensor is attached to a conventional 2D ultrasound transducer to
track its motion with six degrees of freedom (6DoF). In order to estimate the 3D positions and
orientations of the 2D ultrasound images based on the tracked motion of the transducer, spatial
calibration is performed to transform the coordinate system of the 2D ultrasound images to the
coordinate system of the electromagnetic sensor. The spatial calibration is carried out based on the
method proposed by Chen et al. (2009), which was originally developed for calibrating optically-
tracked ultrasound transducers. A new interpolation method, called the edge-preserving distance-
weighted (EPDW), is presented to map the pixels of the acquired 2D ultrasound images to the
voxels of the reconstructed 3D ultrasound volume. The EPDW method enables both effective
speckle reduction and tissue boundary enhancement in the reconstructed 3D ultrasound volumes.
The performance of the EPDW method is compared with a previous interpolation method, called
the adaptive squared-distance-weighted (ASDW). The computational operations of the proposed
freehand imaging system are implemented using the Scilab open-source numerical computation
package (Scilab, 2014) and the C programming language. Moreover, the system is run on a graphics
processing unit (GPU) to reduce its execution time. The visualization of the synthesized
ultrasound volumes is performed using the VTK open-source library, the ParaView open-source
application, and the MITK open-source package. The accuracy of the proposed system is evalu-
ated by imaging a cross-wire phantom with known dimensions. Moreover, the system is used
to synthesize a 3D ultrasound volume for a tissue-mimicking phantom to evaluate the ability of
the proposed EPDW interpolation method to suppress speckle and preserve tissue boundaries.
The rest of the paper is organized as follows: The related work section provides a sum-
mary of previous studies related to the proposed freehand 3D ultrasound imaging system. The
materials and methods section describes the system overview and the freehand 3D ultrasound
imaging procedure. Moreover, this section presents the EPDW interpolation method as well as
the implementation of the 3D ultrasound volume reconstruction and visualization. The last three
sections provide the performance evaluation, the discussion, and the conclusion and future work.

RELATED WORK
This section provides a summary of previous studies related to three important components of the
proposed freehand 3D ultrasound imaging system. These components are the estimation of the
3D positions and orientations of the acquired 2D ultrasound images, the interpolation of the 2D
ultrasound images to reconstruct a 3D ultrasound volume, and the use of open source software
in freehand 3D imaging systems.

Estimating the Positions and Orientations of the 2D Ultrasound Images

Several approaches have been proposed for estimating the 3D positions and orientations of the
acquired 2D ultrasound images. For example, the relative locations and orientations between
the 2D ultrasound images can be computed based on image-based sensing derived from speckle
analysis of the ultrasound images, without the use of position tracking systems (Afsham et al.,
2014; Gee et al., 2006). However, the accuracy of the spatial measurements obtained using
image-based sensing might be limited. Other approaches employ external tracking systems,
such as optical tracking (Treece et al., 2003), electromagnetic tracking (Wen et al., 2013), and
articulated arms (Geiser et al., 1982), to measure the 3D motion of the transducer. In particular,
electromagnetic tracking systems provide an effective approach to estimating the 3D locations of
the acquired ultrasound images due to their relatively low cost and high tracking accuracy. Due
to these advantages, electromagnetic tracking is the most common tracking approach that is
employed in freehand 3D ultrasound imaging systems (Fenster et al., 2011).


Ultrasound Volume Reconstruction

Another important component of freehand 3D ultrasound imaging is the interpolation of the pixels
in the spatially-tracked, irregularly-spaced 2D ultrasound images to reconstruct a 3D ultrasound
volume. Several interpolation methods, such as the methods of Huang & Zheng (2008), Huang
& Zheng (2006), Huang et al. (2005), Prager et al. (1999), and Zhang et al. (2004), have been
introduced to address this problem. In general, these methods can be divided into three groups:
voxel nearest-neighbor (VNN) interpolation, pixel nearest-neighbor (PNN) interpolation, and
distance-weighted (DW) interpolation. In VNN interpolation, each voxel in the reconstructed
3D ultrasound volume is assigned the intensity of the nearest pixel in the 2D ultrasound images.
The VNN interpolation was employed in (Prager et al., 1999) to directly synthesize re-sliced
ultrasound images along arbitrary planes from the acquired 2D ultrasound images. Although the
VNN methods are considered fast, they introduce reconstruction errors when the gaps between
the acquired 2D ultrasound images are large.
The PNN interpolation usually involves two phases: the bin-filling phase and the hole-filling
phase (Rohling et al., 1997). In the bin-filling phase, the intensity of each pixel in the 2D ultra-
sound images is assigned to the nearest voxel in the 3D ultrasound volume. The contributions of
multiple pixels to a single voxel are averaged. In the hole-filling phase, the empty voxels in the
ultrasound volume are identified and filled using local neighborhood averaging. One limitation
of the PNN interpolation is the visible boundaries between the voxels assigned directly by the
2D ultrasound images during the bin-filling phase and the interpolated voxels computed during
the hole-filling phase.
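
As a rough illustration of these two phases, the following C sketch performs bin-filling with a running average and a simple 3×3×3 neighborhood average for hole-filling. The data layout, structure names, and the choice of a one-voxel hole-filling neighborhood are assumptions of this sketch, not details taken from Rohling et al. (1997).

#include <stddef.h>

/* Minimal PNN sketch: bin-filling followed by hole-filling.
 * Assumed layout: vol and hits are nx*ny*nz arrays indexed as
 * idx = (z*ny + y)*nx + x, both zero-initialized by the caller.
 * Pixel positions are assumed to be given in voxel units. */

typedef struct { float x, y, z; unsigned char intensity; } TrackedPixel;

static size_t vidx(int x, int y, int z, int nx, int ny) {
    return ((size_t)z * ny + y) * (size_t)nx + x;
}

/* Bin-filling: assign each pixel to its nearest voxel and average
 * multiple contributions with a running mean. */
void pnn_bin_fill(const TrackedPixel *pix, size_t npix,
                  float *vol, unsigned short *hits,
                  int nx, int ny, int nz)
{
    for (size_t k = 0; k < npix; ++k) {
        int x = (int)(pix[k].x + 0.5f);
        int y = (int)(pix[k].y + 0.5f);
        int z = (int)(pix[k].z + 0.5f);
        if (x < 0 || y < 0 || z < 0 || x >= nx || y >= ny || z >= nz)
            continue;
        size_t i = vidx(x, y, z, nx, ny);
        hits[i]++;
        vol[i] += (pix[k].intensity - vol[i]) / hits[i]; /* running mean */
    }
}

/* Hole-filling: fill voxels that received no pixel with the average of
 * their filled 3x3x3 neighbors (voxels with no filled neighbor stay empty). */
void pnn_hole_fill(float *vol, const unsigned short *hits,
                   int nx, int ny, int nz)
{
    for (int z = 0; z < nz; ++z)
      for (int y = 0; y < ny; ++y)
        for (int x = 0; x < nx; ++x) {
            size_t i = vidx(x, y, z, nx, ny);
            if (hits[i] > 0) continue;
            float sum = 0.0f; int n = 0;
            for (int dz = -1; dz <= 1; ++dz)
              for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int xx = x + dx, yy = y + dy, zz = z + dz;
                    if (xx < 0 || yy < 0 || zz < 0 ||
                        xx >= nx || yy >= ny || zz >= nz) continue;
                    size_t j = vidx(xx, yy, zz, nx, ny);
                    if (hits[j] > 0) { sum += vol[j]; n++; }
                }
            if (n > 0) vol[i] = sum / n;
        }
}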
The DW interpolation traverses each voxel in the reconstructed 3D ultrasound volume and
defines a neighborhood region around it. Each pixel located within the neighborhood region is
weighted by its inverse distance to the voxel center. The voxel intensity is set to the weighted
average of the pixels located within the neighborhood region. The DW interpolation enables
effective reduction of the ultrasound speckle and acoustic shadowing artifacts. However, the
averaging operation employed by this interpolation method blurs the image details and degrades
tissue boundaries. To address the blurring effect, the squared-distance-weighted (SDW) method
was proposed by Huang et al. (2005), in which each pixel in the neighborhood region is
weighted based on the square of the inverse distance between the pixel and the voxel center. In a
later study, Huang and Zheng (2006) proposed an adaptive algorithm that adjusts the weight
distribution of the SDW method based on the first-order statistics of the pixels
located within the neighborhood region. This adaptive algorithm for the SDW interpolation,
which is called the adaptive squared-distance-weighted (ASDW) interpolation, achieved good
results compared with the conventional DW interpolation. However, the ASDW method does
not completely prevent tissue boundary blurring in the reconstructed 3D ultrasound volumes.

Open-Source Software for Freehand 3D Ultrasound Imaging

Several open-source software tools have been used to support the reconstruction and visualization
of the 3D ultrasound volumes synthesized using freehand 3D ultrasound imaging systems. In fact,
the use and development of these open-source tools agree with the recommendations of a recent
Nature paper (Ince et al., 2012), which suggested that open-source development and code release
are crucial for realizing the full benefit of scientific computation. For example, the Visualization
Toolkit (VTK, Schroeder et al., 2003), which is an open-source software library that supports
3D computer graphics and visualization, has been widely used for visualizing and processing
freehand 3D ultrasound volumes. In the study of Boisvert et al. (2008), an open-source software
tool, called SynchroGrab, was developed using the VTK library to enable the acquisition
of tracked ultrasound data and support the reconstruction of ultrasound volumes based on the
acquired images. However, SynchroGrab works only with a limited number of tracking devices
and ultrasound systems. The Medical Imaging Interaction Toolkit (MITK, 2014), which is an
open-source package that supports medical image analysis and visualization, can be employed
for visualizing and processing the 3D ultrasound volumes. ParaView (ParaView, 2014) is another
open-source application for analyzing and visualizing 3D ultrasound volumes. This application
can run on distributed memory computing platforms to enable the analysis and visualization of
large datasets.

MATERIALS AND METHODS


System Overview

Figure 1 shows the system overview of the proposed freehand 3D ultrasound imaging system.
The system is composed of an ultrasound imaging system (SonixTouch Q+, Ultrasonix Inc.,
Richmond, BC, Canada), a 3D electromagnetic tracking system (trakSTAR, NDI, Ontario,
Canada), and a computer workstation. The ultrasound imaging system employed an L14-5/38
linear transducer that has a center frequency of 7.2 MHz. Moreover, the ultrasound system was
configured to acquire ultrasound images at two frame rates: 24 and 30 frames per second. The
acquired 2D ultrasound images were recorded in the form of a digital video, which was manu-
ally transferred from the ultrasound system to the computer workstation. The pixel spacing of
the 2D ultrasound images is equal to 0.11 × 0.11 mm². The 3D movement of the transducer was
tracked by attaching one of the sensors of the electromagnetic tracking system to the transducer.
The tracking system was configured to record the 3D position and orientation of the sensor with
respect to the electromagnetic transmitter of the tracking system at a sampling rate of 750 Hz.
This sampling rate enables improved tracking accuracy by averaging the position readings over
a predefined time window such that the rate of the averaged position data matches the frame rate
of the ultrasound imaging system. A custom-made software tool was developed using the C
language to transfer the spatial data of the sensor from the control unit of the tracking system to
the computer workstation via a USB connection. The computer workstation was equipped with
open-source software tools for performing the 3D ultrasound volume reconstruction and visu-
alization. The details of these tools are described in the following sections.

Freehand 3D Ultrasound Imaging Procedure

As discussed in the Introduction section, freehand 3D ultrasound imaging is performed by
sweeping the ultrasound transducer over a VOI to acquire a sequence of 2D ultrasound images
and tracking the 3D positions and orientations of the acquired 2D images. The spatial tracking
is achieved in this study by attaching an electromagnetic sensor to the ultrasound transducer.
Tracking the movement of the transducer in the 3D space does not directly map the pixels in the
2D ultrasound images to the voxels in the reconstructed 3D ultrasound volume. In fact, spatial
calibration, spatial transformation, and temporal calibration are required to map the positions
of the individual pixels in each 2D ultrasound image to the 3D coordinate system of the recon-
structed ultrasound volume.
The locations of the pixels in the 2D ultrasound image are described using the logical 2D
coordinate system (i, j) that is illustrated in Figure 2.

Figure 1. The freehand 3D ultrasound system consists of an ultrasound imaging system with a conventional 2D transducer, an electromagnetic tracking system composed of a transmitter, a sensor, and a control unit, and a computer workstation with open-source software to perform the 3D volume reconstruction and visualization.

Each 2D ultrasound image has a logical
size of N_i × N_j pixels and a uniform pixel spacing of P_x × P_y mm². Therefore, each 2D ultrasound
image corresponds to a 2D plane in the 3D space with a physical size of L_x × L_y mm², where
L_x = N_i · P_x and L_y = N_j · P_y. The logical location (i, j) of a given pixel in the ultrasound
image corresponds to a 3D physical location of (i · P_x, j · P_y, 0), where the z dimension is set to
0. In the rest of the paper, the 3D coordinate system of the ultrasound image is used to describe
the 3D physical locations of the pixels of the 2D ultrasound image.
The reconstruction of the 3D ultrasound volume involves four 3D coordinate systems, as
described in Figure 3. These coordinate systems are that of the ultrasound image plane (U), that
of the electromagnetic sensor attached to the transducer (S), that of the electromagnetic transmit-
ter (T), and that of the reconstructed ultrasound volume (V). The location of a pixel in the 3D
coordinate system of the ultrasound image, denoted as P_U, can be transformed to the corresponding
location in the 3D coordinate system of the reconstructed ultrasound volume, P_V, as shown
in Equation (1):

P_V = T_{TV} T_{ST} T_{US} P_U    (1)

where T_US, T_ST, and T_TV are the transformation matrices from the coordinate system of the
ultrasound image plane U to that of the sensor S, from the coordinate system of the sensor S to
that of the electromagnetic transmitter T, and from the coordinate system of the electromagnetic
transmitter T to that of the reconstructed ultrasound volume V, respectively.
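
As a concrete illustration of Equation (1), the C sketch below chains the three homogeneous 4×4 transformations to map a pixel at logical location (i, j) into volume coordinates, using the pixel-to-physical conversion (i·P_x, j·P_y, 0) described above. The row-major matrix layout and the function names are assumptions made for this sketch.

/* Row-major 4x4 homogeneous transformation applied to a homogeneous point. */
typedef struct { double m[16]; } Mat4;

static void mat4_apply(const Mat4 *T, const double in[4], double out[4]) {
    for (int r = 0; r < 4; ++r) {
        out[r] = 0.0;
        for (int c = 0; c < 4; ++c)
            out[r] += T->m[4 * r + c] * in[c];
    }
}

/* Map pixel (i, j) of one tracked 2D image into volume coordinates:
 * P_V = T_TV * T_ST * T_US * P_U, with P_U = (i*Px, j*Py, 0, 1). */
void pixel_to_volume(int i, int j, double Px, double Py,
                     const Mat4 *T_US,  /* image plane -> sensor (calibration) */
                     const Mat4 *T_ST,  /* sensor -> transmitter (per frame)   */
                     const Mat4 *T_TV,  /* transmitter -> volume (alignment)   */
                     double P_V[4])
{
    double P_U[4] = { i * Px, j * Py, 0.0, 1.0 };
    double tmp1[4], tmp2[4];
    mat4_apply(T_US, P_U, tmp1);   /* into sensor coordinates      */
    mat4_apply(T_ST, tmp1, tmp2);  /* into transmitter coordinates */
    mat4_apply(T_TV, tmp2, P_V);   /* into volume coordinates      */
}

Because T_US and T_TV are fixed while T_ST changes with every tracked frame, the product T_TV·T_ST·T_US can be precomputed once per frame and then applied to all pixels of that image.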


Figure 2. A 2D ultrasound image with a size of N_i × N_j pixels and a logical 2D coordinate system (i, j). The image corresponds to a 2D plane with a physical size of L_x × L_y mm², where L_x = N_i · P_x, L_y = N_j · P_y, and P_x and P_y are the pixel spacings along the i and j dimensions, respectively.

The transformation matrix T_US is unknown and can be found using spatial calibration. Chen
et al. (2009) presented an automatic spatial calibration method for freehand 3D ultrasound imag-
ing systems. The method involves using the freehand system to scan a double-N wire
phantom with known structure and analyzing the 2D ultrasound images and the 3D spatial
locations of the ultrasound transducer acquired during phantom scanning. The formulation of this
calibration method was originally designed for freehand 3D ultrasound imaging systems that
employ optical tracking to measure the 3D position and orientation of the ultrasound transducer.
In the current study, this calibration method has been adapted to enable the calibration of freehand
3D ultrasound imaging systems that use electromagnetic tracking. The double-N wire phantom
used in this study is composed of an open-ended box with dimensions of 100 mm × 40 mm ×
34 mm. The phantom included upper and lower layers of N-wires separated by 10 mm. In each
layer of N-wires, the distance between the two parallel wires was set to 25 mm. The adapted
calibration method was applied to compute the transformation matrix T_US.
The transformation matrix T_ST, which represents the transformation between the electromagnetic
sensor S and the transmitter T, is obtained from the 3D position readings of the sensor
S, which are provided by the electromagnetic tracking system. The transformation T_TV is an
alignment transformation that enables customized representation of the reconstructed ultrasound
volume. It is worth noting that the transformations T_US and T_TV remain constant throughout the
ultrasound volume reconstruction process, while the transformation T_ST varies with the motion
of the transducer.
Another important step in freehand 3D ultrasound imaging is to synchronize the sequence
of acquired 2D ultrasound images with the recorded 3D position data of the electromagnetic sen-
sor. This synchronization is called temporal calibration. In this study, the temporal calibration is
performed at the beginning of ultrasound imaging by quickly shaking the ultrasound transducer,
along with the attached sensor, up and down. This quick motion is detected in the sequence of 2D
ultrasound images by performing correlation analysis between each pair of consecutive ultrasound
images to quantify the relative displacement between these images. The ultrasound image pair
with the highest relative displacement is identified. Similarly, the recorded 3D position readings

Copyright © 2014, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
46 International Journal of Open Source Software and Processes, 5(3), 39-57, July-September 2014

Figure 3. Four coordinate systems are employed in the 3D ultrasound volume reconstruction:
the ultrasound image plane coordinate system (U), the electromagnetic sensor coordinate system
(S), the electromagnetic transmitter coordinate system (T), and the reconstructed ultrasound
volume coordinate system (V).

of the sensor are analyzed to identify the point with the highest special displacement. The point
of the highest displacement in the sequence of ultrasound images is matched with the point of
highest displacement in the recorded position readings of the sensor.
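
A minimal sketch of this peak-matching idea is shown below in C. It uses the mean absolute difference between consecutive frames as a simple proxy for the inter-frame displacement and the Euclidean step between consecutive sensor readings for the tracking stream; the actual system relies on correlation analysis, so the metric, data layout, and function names here are simplifying assumptions.

#include <math.h>
#include <stddef.h>

/* Index of the frame pair (f, f+1) with the largest mean absolute
 * intensity difference; used as a proxy for the shaking peak. */
size_t peak_frame_pair(const unsigned char *frames, size_t nframes,
                       size_t npix /* pixels per frame */)
{
    size_t best = 0;
    double best_score = -1.0;
    for (size_t f = 0; f + 1 < nframes; ++f) {
        const unsigned char *a = frames + f * npix;
        const unsigned char *b = frames + (f + 1) * npix;
        double sum = 0.0;
        for (size_t p = 0; p < npix; ++p)
            sum += fabs((double)a[p] - (double)b[p]);
        double score = sum / (double)npix;
        if (score > best_score) { best_score = score; best = f; }
    }
    return best;
}

/* Index of the sensor reading pair with the largest positional step. */
size_t peak_sensor_pair(const double *pos /* x,y,z triplets */, size_t nreads)
{
    size_t best = 0;
    double best_step = -1.0;
    for (size_t k = 0; k + 1 < nreads; ++k) {
        double dx = pos[3*(k+1)+0] - pos[3*k+0];
        double dy = pos[3*(k+1)+1] - pos[3*k+1];
        double dz = pos[3*(k+1)+2] - pos[3*k+2];
        double step = sqrt(dx*dx + dy*dy + dz*dz);
        if (step > best_step) { best_step = step; best = k; }
    }
    return best;
}

/* Temporal offset (in seconds) aligning the two streams, given their rates. */
double temporal_offset(size_t frame_peak, double frame_rate_hz,
                       size_t sensor_peak, double sensor_rate_hz)
{
    return (double)sensor_peak / sensor_rate_hz
         - (double)frame_peak  / frame_rate_hz;
}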
After performing the spatial and temporal calibrations, the 2D ultrasound images are mapped
to the coordinate system of the reconstructed ultrasound volume using Equation (1). The origin
and size of the reconstructed ultrasound volume are determined using the bounding box technique.
In this technique, the minimum values of the x-, y-, and z-dimensions, denoted as X_min, Y_min,
and Z_min, across all 2D ultrasound images are computed. Similarly, the maximum values of the three
dimensions, denoted as X_max, Y_max, and Z_max, across the 2D ultrasound images are determined.
The reconstructed ultrasound volume can be defined by the two corner points (X_min, Y_min,
Z_min) and (X_max, Y_max, Z_max), as illustrated in Figure 4. The voxel spacing along each
Cartesian dimension in the reconstructed 3D ultrasound volume is set to 0.2 mm.
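
The bounding-box computation can be sketched in C as follows. The sketch assumes that mapping only the four corners of each (planar) 2D image through Equation (1) is sufficient to bound all of its pixels, and that the number of voxels is derived from the chosen 0.2 mm spacing; the names and structure are illustrative.

#include <float.h>
#include <math.h>

/* Axis-aligned bounding box of the transformed image corners. */
typedef struct { double min[3], max[3]; } BBox;

void bbox_init(BBox *b) {
    for (int d = 0; d < 3; ++d) { b->min[d] = DBL_MAX; b->max[d] = -DBL_MAX; }
}

/* corners: 4 points (x, y, z) of one transformed image plane. */
void bbox_add_image(BBox *b, const double corners[4][3]) {
    for (int c = 0; c < 4; ++c)
        for (int d = 0; d < 3; ++d) {
            if (corners[c][d] < b->min[d]) b->min[d] = corners[c][d];
            if (corners[c][d] > b->max[d]) b->max[d] = corners[c][d];
        }
}

/* Number of voxels along each dimension for a given voxel spacing (mm). */
void bbox_dimensions(const BBox *b, double spacing, int dims[3]) {
    for (int d = 0; d < 3; ++d)
        dims[d] = (int)ceil((b->max[d] - b->min[d]) / spacing) + 1;
}

With the bounding box in place, a voxel index (v_x, v_y, v_z) corresponds to the physical position (X_min + 0.2·v_x, Y_min + 0.2·v_y, Z_min + 0.2·v_z) in millimeters.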
The acquired 2D ultrasound images are arbitrarily located in the 3D space. Therefore, the
pixel intensities of the 2D ultrasound images must be interpolated to compute the intensities
of the regularly-spaced voxels in the reconstructed 3D ultrasound volume. A new interpolation
method, called the edge-preserving distance-weighted (EPDW) method, is proposed in this study
to compute the voxel intensities.


Figure 4. The reconstruction of the 3D ultrasound volume using the bounding box technique

The Proposed Edge-Preserving Distance-Weighted (EPDW) Interpolation Method
The EPDW method improves the performance of the distance-weighted (DW), squared-distance-
weighted (SDW), and adaptive squared-distance-weighted (ASDW) methods to enable effective
suppression of ultrasound speckle and preservation of tissue boundaries. Therefore, these three
methods are reviewed before describing the EPDW method.
The DW interpolation method (Rohling et al., 1999) provides an effective technique for comput-
ing the intensities of the voxels in the reconstructed 3D ultrasound volume. In this approach, a
neighborhood region is defined around each voxel, and the weighted average intensity of the
pixels located within this region is assigned to the voxel. The weight of a given pixel within
the neighborhood region is equal to the inverse distance between the center of the pixel and the
center of the voxel. The DW interpolation offers the advantages of reducing ultrasound speckle,
but it blurs tissue boundaries.
To reduce the blurring effect of the DW method, Huang et al. (2005) proposed an improved
interpolation method, called the squared-distance-weighted (SDW). In this method, the inten-
sity of a particular voxel in the 3D ultrasound volume is computed by defining a spherical
neighborhood region R with a predefined radius and a center located at the voxel. Each pixel in
R is weighted based on the square of the inverse distance between the pixel and the center of
the region (i.e. the voxel center), as illustrated in Figure 5. Using this method, the intensity,
I(V_V), of a voxel that is located at a volume coordinate V_V can be expressed as (Huang et al., 2005):

I(V_V) = \frac{\sum_{k=0}^{M} w_k I(P_{v_k})}{\sum_{k=0}^{M} w_k}, \quad w_k = \frac{1}{(d_k + \alpha)^2}    (2)


Figure 5. A 2D illustration of the SDW method. The intensity of a voxel in the ultrasound volume
is computed by defining a spherical neighborhood region R around the voxel. Each pixel within
R is weighted based on the square of the inverse distance between the pixel and the voxel. The
weighted average intensity of the pixels inside R is assigned to the voxel.

where M is the number of pixels located within the region R surrounding the voxel, I(P_{v_k}) is
the intensity of the kth pixel located at a volume coordinate P_{v_k} within R, w_k is the weight of
the pixel located at P_{v_k}, d_k is the distance between V_V and P_{v_k}, and α is a constant for
controlling the weight distribution of the pixels within R. When a large value of α is used, the SDW
method provides strong smoothing, and hence enables effective speckle reduction. On the other
hand, when α is small, the SDW method preserves tissue boundaries, but achieves limited
speckle suppression. Hence, the SDW method can be configured using α to either suppress
speckle without preserving the edges, or preserve tissue boundaries but with limited speckle
reduction.
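
The core of the SDW weighted average in Equation (2) can be sketched in C as shown below for a single voxel. The sketch assumes that the pixels falling inside the spherical region R have already been gathered (for example, through a spatial binning structure); that gathering step, the structure name, and the handling of an empty region are assumptions of this sketch.

#include <math.h>
#include <stddef.h>

/* A 2D-image pixel mapped into volume coordinates via Equation (1). */
typedef struct { double x, y, z; double intensity; } MappedPixel;

/* SDW estimate (Equation (2)) of the voxel intensity at (vx, vy, vz).
 * neigh holds the M pixels inside the spherical region R around the voxel.
 * Returns a negative value if the region contains no pixels. */
double sdw_voxel_intensity(double vx, double vy, double vz,
                           const MappedPixel *neigh, size_t M, double alpha)
{
    double num = 0.0, den = 0.0;
    for (size_t k = 0; k < M; ++k) {
        double dx = neigh[k].x - vx;
        double dy = neigh[k].y - vy;
        double dz = neigh[k].z - vz;
        double d  = sqrt(dx*dx + dy*dy + dz*dz);
        double w  = 1.0 / ((d + alpha) * (d + alpha)); /* w_k = 1/(d_k+alpha)^2 */
        num += w * neigh[k].intensity;
        den += w;
    }
    return (den > 0.0) ? num / den : -1.0;
}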
The ASDW method (Huang & Zheng, 2006) was introduced to improve the operation of
the SDW method to achieve both speckle reduction and edge preservation. In particular, the
value of α in Equation (2) is adaptively computed based on the ratio σ_I² / μ_I, where μ_I and
σ_I² are the mean and variance, respectively, of the pixel intensities within R. In homogeneous
regions that do not include edges, the ratio σ_I² / μ_I is expected to be lower than the ratio
computed for regions with tissue boundaries. Therefore, the computation of the voxel intensity I(V_V)
is modified to adjust the weights of the pixels within R based on the ratio σ_I² / μ_I (Huang &
Zheng, 2006):


I(V_V) = \frac{\sum_{k=0}^{M} w_k I(P_{v_k})}{\sum_{k=0}^{M} w_k}, \quad w_k = \frac{1}{\left(d_k + a e^{-b \sigma_I^2 / \mu_I}\right)^2}    (3)

where a and b are two positive constant parameters set by the user. The ultrasound volumes
constructed using the ASDW method outperform the volumes synthesized using the SDW
method. However, the ASDW method does not completely resolve the blurring problem.
The EPDW method, proposed in this paper, enables effective speckle reduction and tissue
boundary preservation by adaptively adjusting the value of α in Equation (2) based on the
edges of the 2D ultrasound images. The operation of the EPDW method is summarized as fol-
lows. At the beginning of ultrasound volume reconstruction, each acquired 2D ultrasound image
is preprocessed using the speckle reducing anisotropic diffusion (SRAD) filter (Yu & Acton, 2002) to
suppress speckle and prepare the image for edge detection. Eight Newton filters (Alemán-Flores et al.,
2001) are applied to the preprocessed ultrasound image to detect the edges. Figure 6 shows an
ultrasound image and the matching edge-detected image obtained using the Newton filters. The
raw (not preprocessed) 2D ultrasound images and the corresponding edge-detected images are
mapped to the coordinate system of the reconstructed ultrasound volume using Equation (1).
Figure 6. (a) A 2D ultrasound image and (b) the matching edge-detected image

Similar to the SDW and ASDW methods, a spherical region R is defined around each voxel in
the reconstructed 3D ultrasound volume. The intensity, I(V_V), of a voxel located at V_V is
computed based on the pixels of the raw 2D ultrasound images that are located within the region
R, such that the weighting of these ultrasound pixels is adaptively adjusted based on the edges
located within R. The edge-detected images can be used to perform this adaptive weighting.
Therefore, the computation of I(V_V) can be expressed as:

I(V_V) = \frac{\sum_{k=0}^{M} w_k I(P_{v_k})}{\sum_{k=0}^{M} w_k}, \quad w_k = \frac{1}{\left(d_k + c e^{-d \mu_E}\right)^2}, \quad \mu_E = \frac{1}{M} \sum_{k=0}^{M} E(P_{v_k})    (4)


where I(P_{v_k}) and E(P_{v_k}) are the intensity of a pixel in a 2D ultrasound image located at a
volume coordinate P_{v_k} inside R and the value of the corresponding pixel in the edge-detected
image, respectively, and μ_E is the mean value of the pixels of the edge-detected images located
within R. The positive constants c and d are set by the user.
The EPDW method adjusts the weight distribution of the ultrasound pixels within the
neighborhood region R based on the mean value of the edges located inside R. For homogeneous
tissue regions that do not include strong edges, the term e^{-d·μ_E}, which corresponds to α in
Equation (2), will have a large value, and hence strong smoothing and effective speckle reduction
will be achieved. However, for regions that include tissue boundaries, the term e^{-d·μ_E} will have
a relatively small value, which leads to light smoothing and efficient edge preservation.
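
Following the same pattern, the C sketch below computes a single voxel intensity with the EPDW weighting of Equation (4). It assumes that each mapped pixel carries both its raw ultrasound intensity and the value of the corresponding pixel in the edge-detected image, and that the edge values are scaled consistently with the chosen constant d; these conventions are assumptions of the sketch rather than requirements of the method.

#include <math.h>
#include <stddef.h>

/* A mapped pixel carrying the raw intensity and the edge-image value. */
typedef struct { double x, y, z; double intensity; double edge; } EdgePixel;

/* EPDW estimate (Equation (4)) of the voxel intensity at (vx, vy, vz).
 * neigh holds the M pixels inside the spherical region R around the voxel;
 * c and d are the user-set positive constants. */
double epdw_voxel_intensity(double vx, double vy, double vz,
                            const EdgePixel *neigh, size_t M,
                            double c, double d)
{
    if (M == 0) return -1.0;

    /* mu_E: mean edge value inside R (Equation (4)). */
    double mu_E = 0.0;
    for (size_t k = 0; k < M; ++k) mu_E += neigh[k].edge;
    mu_E /= (double)M;

    /* alpha = c * exp(-d * mu_E): large in homogeneous regions (strong
     * smoothing), small near tissue boundaries (edge preservation). */
    double alpha = c * exp(-d * mu_E);

    double num = 0.0, den = 0.0;
    for (size_t k = 0; k < M; ++k) {
        double dx = neigh[k].x - vx;
        double dy = neigh[k].y - vy;
        double dz = neigh[k].z - vz;
        double dist = sqrt(dx*dx + dy*dy + dz*dz);
        double w = 1.0 / ((dist + alpha) * (dist + alpha));
        num += w * neigh[k].intensity;
        den += w;
    }
    return num / den;
}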

Open-Source Implementation of the Ultrasound Volume Reconstruction and Visualization
The computational operations of the proposed freehand 3D ultrasound imaging system were
implemented using both Scilab (Scilab, 2014) and the C programming language, as shown in
Figure 7. Scilab is a powerful open-source programming environment that incorporates a large
set of algorithms and tools to facilitate the tasks of data processing and analysis in various scientific
and engineering applications. The Scilab language supports dynamic compilation and linking
of code written in other programming languages, including C, to enable calls to external libraries and
functions. In this study, the operations that do not require long processing times, including the
loading of the 2D ultrasound images and transducer’s position information, the temporal calibra-
tion, the preprocessing and edge detection of the ultrasound images, and the transformation of
the pixels’ locations from the coordinate system of the image plane to that of the reconstructed
ultrasound volume, were implemented using Scilab. The computationally-intensive task of
computing the intensity of each voxel in the reconstructed 3D ultrasound volume based on the
pixels of the 2D ultrasound images was implemented using C. In this study, the EPDW method
(Equation (4)) and the ASDW method (Equation (3)) were separately applied to compute the
voxel intensities. The C language was also used to store the reconstructed 3D ultrasound volume
in the legacy vtk file format included in the VTK open-source library (Schroeder et al., 2003).
The VTK library supports the visualization and image processing of 3D computer graphics and
incorporates a large set of data representations, including 3D volumes. The computationally-
intensive operations, which were implemented using C, were run on a graphics processing unit
(GPU) to reduce their run time.
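
To illustrate the storage step, the C sketch below writes an 8-bit volume to a legacy ASCII .vtk file using the STRUCTURED_POINTS dataset type, which ParaView and MITK can load directly. The ASCII variant is used here only for simplicity (the binary variant is more compact), and the function name and argument list are assumptions of this sketch.

#include <stdio.h>

/* Write an 8-bit volume to a legacy ASCII VTK STRUCTURED_POINTS file.
 * vol is indexed with x varying fastest: idx = (z*ny + y)*nx + x,
 * which matches the point ordering expected by the legacy format. */
int write_legacy_vtk(const char *path, const unsigned char *vol,
                     int nx, int ny, int nz,
                     double ox, double oy, double oz,   /* origin (mm)  */
                     double sx, double sy, double sz)   /* spacing (mm) */
{
    FILE *f = fopen(path, "w");
    if (!f) return -1;

    long n = (long)nx * ny * nz;
    fprintf(f, "# vtk DataFile Version 3.0\n");
    fprintf(f, "Freehand 3D ultrasound volume\n");
    fprintf(f, "ASCII\n");
    fprintf(f, "DATASET STRUCTURED_POINTS\n");
    fprintf(f, "DIMENSIONS %d %d %d\n", nx, ny, nz);
    fprintf(f, "ORIGIN %g %g %g\n", ox, oy, oz);
    fprintf(f, "SPACING %g %g %g\n", sx, sy, sz);
    fprintf(f, "POINT_DATA %ld\n", n);
    fprintf(f, "SCALARS intensity unsigned_char 1\n");
    fprintf(f, "LOOKUP_TABLE default\n");

    for (long i = 0; i < n; ++i)
        fprintf(f, "%u\n", (unsigned)vol[i]);

    return fclose(f) == 0 ? 0 : -1;
}

For a volume such as the 296 × 219 × 132 voxel phantom reconstruction described later, the call would take the bounding-box origin and the 0.2 mm spacing as the origin and spacing arguments.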
The reconstructed 3D ultrasound volume was visualized using both the MITK open-source
Toolkit (MITK, 2014) and the ParaView open-source application (ParaView, 2014), which en-
able interactive and effective visualization of scientific 3D data volumes.


Figure 7. The operations that are implemented using Scilab and the C language.

PERFORMANCE EVALUATION
Accuracy of the Proposed Freehand 3D Ultrasound Imaging System

The accuracy of the freehand 3D ultrasound system is mainly determined by the spatial calibration
process, which estimates the transformation between the coordinate system of the 2D ultrasound
images and that of the electromagnetic sensor attached to the transducer. To evaluate the cali-
bration accuracy, a 3D ultrasound volume was synthesized for the double-N wire phantom that
was used in the spatial calibration process as described in the Materials and Methods section.
During imaging, the ultrasound system was configured to acquire 2D ultrasound images at a
rate of 24 frames per second. The synthesized ultrasound volume was registered to the physical
calibration phantom by setting the coordinate system (V) of the ultrasound volume to match that
of the phantom. The visualization of the ultrasound volume was performed using the ParaView
open-source application. The fiducial registration error (FRE) (Chen et al., 2009; Mercier et al.,
2005) was calculated by computing the root-mean-square difference between the positions of the
two N-wires in the reconstructed ultrasound volume and the matching N-wires in the physical
phantom. The computed FRE was at the sub-millimeter level.
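
For reference, the root-mean-square distance underlying such a fiducial-based error measure can be computed as in the short C sketch below, given matched 3D fiducial positions in the reconstructed volume and in the physical phantom; the flat x, y, z data layout is an assumption of the sketch.

#include <math.h>
#include <stddef.h>

/* Root-mean-square distance between n matched 3D fiducial positions
 * (x, y, z triplets) measured in the volume and in the physical phantom. */
double fiducial_rms_error(const double *vol_pts, const double *phantom_pts,
                          size_t n)
{
    double sum_sq = 0.0;
    for (size_t k = 0; k < n; ++k) {
        double dx = vol_pts[3*k+0] - phantom_pts[3*k+0];
        double dy = vol_pts[3*k+1] - phantom_pts[3*k+1];
        double dz = vol_pts[3*k+2] - phantom_pts[3*k+2];
        sum_sq += dx*dx + dy*dy + dz*dz;
    }
    return sqrt(sum_sq / (double)n);
}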
A second accuracy-evaluation experiment was carried out by imaging a double cross-wire
phantom. The phantom was composed of two parallel layers, where each layer included two
intersecting wires. The spacing between the two layers was equal to 10 mm. The phantom was
embedded in water. The 3D position of the crossing point between the two wires in first layer
and the position of the crossing point between the two wires in the second layer were known
with respect to the coordinate system of the phantom. The freehand 3D ultrasound system was
configured to synthesize a 3D ultrasound volume for the cross-wire phantom such that the coor-
dinate system of the reconstructed ultrasound volume (V) matches that of the phantom. The 3D
positions of the two crossing points in the 3D ultrasound volume were compared with the positions
of the matching crossing points in the physical phantom. The differences between the positions
of the crossing points in the 3D ultrasound volume and the physical phantom were less than 1
mm. Figure 8 shows the 3D ultrasound volume synthesized for the double cross-wire phantom.
The visualization of the 3D ultrasound volume was performed using ParaView.

Performance of the Proposed EPDW Interpolation Method

To evaluate the performance of the EPDW method, the freehand imaging system was used to
image the CIRS Model 055 calibration phantom (CIRS, Norfolk, Virginia, USA) that includes
two egg-shape objects with known structure. The imaging only covered one of the two objects.
The transducer was moved during freehand imaging along the short axis of the egg-shape object
to acquire 170 2D ultrasound images at a frame rate of 30 frames per second.
The EPDW method, with parameters of c = 5.0 and d = 8.0, was used to reconstruct a
3D ultrasound volume for the phantom. The size of the volume was set to 296 × 219 × 132
voxels with a voxel size of 0.2 × 0.2 × 0.2 mm³. The radius of the spherical neighborhood region
R of the EPDW method was set to 0.5 mm. For comparison, the volume reconstruction was also
carried out using the ASDW interpolation method, where the values of a and b were set to
1000.0 and 2.0, respectively, as suggested by Huang and Zheng (2006), and the neighborhood
region R was configured to match the EPDW method. The reconstructed 3D ultrasound volumes
were visualized using the MITK open-source toolkit. The 3D ultrasound volume obtained using
the EPDW method is shown in Figure 9. The volume clearly shows the egg-shape object inside
the phantom. To compare the performance of the EPDW method with the ASDW method, two
slices were extracted from the ultrasound volume synthesized using each method.

Figure 8. A 3D ultrasound volume synthesized for the double cross-wire phantom. The ultrasound volume was visualized using the ParaView application.

Figure 9. An ultrasound volume synthesized for an egg-shape object in the CIRS Model 055 phantom using the EPDW method. The volume is visualized using the MITK toolkit.

The first slice, denoted as the in-plane slice, was obtained along the imaging plane of the transducer (i.e. parallel
to the acquired 2D ultrasound images). The second slice, called the out-of-plane slice, was obtained
perpendicular to the imaging plane. The in-plane slices of the EPDW and ASDW methods are
shown in Figures 10(a) and 10(b), respectively. Moreover, the out-of-plane slices of the EPDW
and ASDW methods are presented in Figures 10(c) and 10(d), respectively. The edges obtained
using the EPDW method exhibit less blurring than the edges achieved with the ASDW
method. To provide a qualitative comparison, the voxel intensities along the row in the out-of-plane
slice of the EPDW method (Figure 10(c)) that is marked by the white dashed line and the matching
row in the out-of-plane slice of the ASDW method (Figure 10(d)) are shown in Figures 10(e)
and 10(f), respectively. The two edges of the egg-shape object, indicated by the dashed circles,
are sharper when obtained using the EPDW method than when obtained using the
ASDW method. These results indicate that the EPDW method enables effective edge preserva-
tion during 3D ultrasound volume reconstruction.

DISCUSSION
The ongoing research in the field of freehand 3D ultrasound imaging aims to expand the ca-
pabilities of the widely-available, low-cost, conventional 2D ultrasound imaging systems to
support the synthesis of 3D ultrasound volumes. This study contributes to this research by
describing the use of open source software to develop a freehand system that can synthesize 3D
ultrasound volumes with high quality.

Figure 10. The in-plane slices obtained with the (a) EPDW and (b) ASDW methods. The out-of-plane slices obtained with the (c) EPDW and (d) ASDW methods. The comparison of voxel intensities achieved using the (e) EPDW and (f) ASDW methods along the row indicated by the white dashed lines in the out-of-plane slices shown in figures (c) and (d), respectively.

The open-source software tools employed in the current study, including Scilab, VTK, MITK, and ParaView, have been applied to effectively implement several components of the proposed system without introducing a substantial cost increase or changing the system workflow. In particular, Scilab was employed along with the C language to
implement the computational tasks, the VTK library was used to store the ultrasound volumes,
and both MITK and ParaView were used to visualize the ultrasound volumes.
The accuracy analysis reported in this study showed that the proposed freehand system
can synthesize 3D ultrasound volumes with sub-millimeter accuracy. This improved accuracy
is attributed to the effective spatial calibration that is achieved using the method proposed by
Chen et al. (2009). This calibration method was initially introduced for freehand ultrasound
systems with optical tracking devices, and adapted in the current study to perform the spatial
calibration for the proposed freehand system that uses electromagnetic tracking. However, the
analysis reported in the current study did not consider the effect of the metal objects that might
degrade the accuracy of electromagnetic tracking. In fact, analyzing the effect of metal objects
is important for applying the proposed freehand system in medical procedures that involve
the use of metal objects, such as metal surgical instruments and surgical beds.
One important contribution of the current study is the introduction of the novel EPDW inter-
polation method to enable effective edge preservation during the reconstruction of 3D ultrasound
volumes. In this interpolation method, the intensity of a particular voxel is assigned by comput-
ing the weighted intensity average of the pixels located within a spherical neighborhood region
centered on the voxel. The weight distribution of the pixels within the neighborhood region is
adaptively adjusted based on the edges of the 2D ultrasound images located within the region.
This edge-based adjustment is important to minimize the blurring artifact in the regions that
include tissue boundaries and to enable effective speckle reduction in homogeneous regions. The
experimental analysis of this study has compared the performance of the EPDW method with
the well-established ASDW method. To carry out the comparison, both interpolation methods
were used to reconstruct an ultrasound volume of an egg-shape object in the CIRS Model 055
calibration phantom. The results demonstrate that the ultrasound volume reconstructed using the
EPDW method has lower blurring at the edges. Preserving the edges is crucial in various medi-
cal procedures. For example, the accuracy of segmenting and classifying breast tumors using
ultrasound imaging depends on the ability to outline the tumor boundary.
The performance evaluation of the 3D reconstruction accuracy and edge preservation of
the proposed freehand 3D ultrasound system was solely based on phantom experiments. In
fact, such phantom analyses are important since the phantoms have known structures, which
facilitate the quantitative analysis. However, the application of the proposed system in medical
procedures, such as 3D breast cancer screening, requires expanding the analysis to evaluate the
capability of the proposed system to synthesize high-quality ultrasound volumes for real tissues.

CONCLUSION AND FUTURE WORK


In this paper, a freehand imaging system is presented to synthesize 3D ultrasound volumes with
high accuracy using electromagnetic tracking. A new interpolation method, called EPDW, is
proposed to map the pixels of the 2D ultrasound images to the voxels of the reconstructed 3D
ultrasound volume. The development of the system is achieved using open-source software. In
particular, the Scilab programming environment and the C language were used to implement
the computational operations. The ultrasound volumes were stored in the legacy vtk file format
provided by the VTK open-source library. The visualization of the ultrasound volumes was
achieved using the MITK open-source toolkit and the ParaView open-source application. The
experimental results demonstrate that the freehand imaging system can synthesize 3D ultrasound
volumes with sub-millimeter accuracy. Moreover, the results show that the ultrasound volume
synthesized using the EPDW method outperforms the volume obtained using a previous inter-
polation method, called the ASDW, in terms of edge preservation.
The future directions include extending the analysis to evaluate the capability of the sys-
tem to synthesize accurate 3D ultrasound volumes for real tissues. This analysis will focus on
evaluating the ability of the EPDW method to improve the performance of tissue segmentation
algorithms by preserving tissue boundaries. The future directions will also examine the effect
of metal objects on the accuracy of the synthesized ultrasound volumes.


ACKNOWLEDGMENT
The research contributions of Author1, Author2, and Author4 are supported by the Scientific
Research Support Fund, Jordan. The research contribution of Author 3 is supported by Research
Support Fund, United Arab Emirates University, UAE.

REFERENCES
Afsham, N., Najafi, M., Abolmaesumi, P., & Rohling, R. (2014). A generalized correlation-based model
for out-of-plane motion estimation in freehand ultrasound. IEEE Transactions on Medical Imaging, 33(1),
186–199. doi:10.1109/TMI.2013.2283969 PMID:24108710
Alemán-Flores, M., Álvarez, L., & Moreno, R. (2001). Modified Newton filters for edge orientation estima-
tion and shape representation. Proceedings of the International Conference on Signal Processing, Pattern
Recognition and Applications (pp. 27-32). Rhodes, Greece.
Bax, J., Cool, D., Gardi, L., Knight, K., Smith, D., Montreuil, J., & Fenster, A. et  al. (2008). Me-
chanically assisted 3D ultrasound guided prostate biopsy system. Medical Physics, 35(12), 5397–5410.
doi:10.1118/1.3002415 PMID:19175099
Boisvert, J., Gobbi, D., Vikal, S., Rohling, R., Fichtinger, G., & Abolmaesumi, P. (2008). An open-source
solution for interactive acquisition, processing and transfer of interventional ultrasound images. Proceed-
ings of Workshop on Systems and Architectures for Computer Assisted Interventions (pp. 1-8), New York
City, USA. Retrieved from http://hdl.handle.net/10380/1459
Chen, T. K., Thurston, A. D., Ellis, R. E., & Abolmaesumi, P. (2009). A real-time freehand ultrasound cali-
bration system with automatic accuracy feedback and control. Ultrasound in Medicine & Biology, 35(1),
79–93. doi:10.1016/j.ultrasmedbio.2008.07.004 PMID:18829150
Daoud, M. I., Abolmaesumi, P., You, W., Salcudean, S. E., & Rohling, R. N. (2011). Signature-based
algorithm for improved needle localization in ultrasound images: A feasibility study. Proceedings of the
IEEE International Ultrasonics Symposium (pp. 1575-1578). Orlando, FL, USA.
Fenster, A., Parraga, G., & Bax, J. (2011). Three-dimensional ultrasound scanning. Interface Focus, 1(4),
503–519. doi:10.1098/rsfs.2011.0019 PMID:22866228
Gee, A. H., Housden, R. J., Hassenpflug, P., Treece, G. M., & Prager, R. W. (2006). Sensorless freehand 3-D
ultrasound in real tissue: Speckle decorrelation without fully developed speckle. Medical Image Analysis,
10(2), 137–149. doi:10.1016/j.media.2005.08.001 PMID:16143560
Geiser, E. A., Christie, L. G., Conetta, D. A., Conti, C. R., & Gossman, G. S. (1982). A mechanical arm for
spatial registration of two-dimensional echocardiographic sections. Catheterization and Cardiovascular
Diagnosis, 8(1), 89–101. doi:10.1002/ccd.1810080114 PMID:7060123
Gooding, M. J., Kennedy, S., & Noble, J. A. (2008). Volume segmentation and reconstruction from freehand
three-dimensional ultrasound data with application to ovarian follicle measurement. Ultrasound in Medicine
& Biology, 34(2), 183–195. doi:10.1016/j.ultrasmedbio.2007.07.023 PMID:17935866
Huang, Q. H., & Zheng, Y. P. (2006). An adaptive squared-distance-weighted interpolation for volume
reconstruction in 3D freehand ultrasound. Ultrasonics, 44, e73–e77. doi:10.1016/j.ultras.2006.06.040
PMID:16844174
Huang, Q. H., & Zheng, Y. P. (2008). Volume reconstruction of freehand three-dimensional ultrasound using
median filters. Ultrasonics, 48(3), 182–192. doi:10.1016/j.ultras.2007.11.005 PMID:18206200
Huang, Q. H., Zheng, Y. P., Lu, M. H., & Chi, Z. R. (2005). Development of a portable 3D ultrasound
imaging system for musculoskeletal tissues. Ultrasonics, 43(3), 153–163. doi:10.1016/j.ultras.2004.05.003
PMID:15556650


Ince, D. C., Hatton, L., & Graham-Cumming, J. (2012). The case for open computer programs. Nature,
482(7386), 485–488. doi:10.1038/nature10836 PMID:22358837
Liao, Y.-Y., Tsui, P.-H., Li, C.-H., Chang, K.-J., Kuo, W.-H., Chang, C.-C., & Yeh, C.-K. (2011). Classification
of scattering media within benign and malignant breast tumors based on ultrasound texture-feature-based
and Nakagami-parameter images. Medical Physics, 38(4), 2198–2207.
Mercier, L., Langø, T., Lindseth, F., & Collins, L. D. (2005). A review of calibration techniques for freehand
3-D ultrasound systems. Ultrasound in Medicine & Biology, 31(2), 143–165. doi:10.1016/j.ultrasmed-
bio.2004.11.001 PMID:15708453
MITK. (2014). The Medical Imaging Interaction Toolkit (MITK). Retrieved from http://www.mitk.org
Oralkan, O., Ergun, A. S., Cheng, C. H., Johnson, J. A., Karaman, M., Lee, T. H., & Khuri-Yakub, B. T.
(2003). Volumetric ultrasound imaging using 2-D CMUT arrays. IEEE Transactions on Ultrasonics, Fer-
roelectrics, and Frequency Control, 50(11), 1581–1594. doi:10.1109/TUFFC.2003.1251142 PMID:14682642
ParaView. (2014). ParaView. Retrieved from http://www.paraview.org
Prager, R. W., Gee, A., & Berman, L. (1999). Stradx: Real-time acquisition and visualization of freehand
three-dimensional ultrasound. Medical Image Analysis, 3(2), 129–140. doi:10.1016/S1361-8415(99)80003-
6 PMID:10711995
Rohling, R., Gee, A., & Berman, L. (1997). Three-dimensional spatial compounding of ultrasound images.
Medical Image Analysis, 1(3), 177–193. doi:10.1016/S1361-8415(97)85009-8 PMID:9873905
Rohling, R. N., Gee, A. H., & Berman, L. (1999). A comparison of freehand three-dimensional ultrasound
reconstruction techniques. Medical Image Analysis, 3(4), 339–359. doi:10.1016/S1361-8415(99)80028-0
PMID:10709700
Scheipers, U., Koptenko, S., Remlinger, R., Falco, T., & Lachaine, M. (2010). 3-D ultrasound volume re-
construction using the direct frame interpolation method. IEEE Transactions on Ultrasonics, Ferroelectrics,
and Frequency Control, 57(11), 2460–2470. doi:10.1109/TUFFC.2010.1712 PMID:21041133
Schroeder, W., Martin, K., & Lorensen, B. (2003). The Visualization Toolkit (3rd ed.). Clifton Park, New
York, United States: Kitware Inc.
Scilab. (2014). Scilab Enterprises. Retrieved from http://www.scilab.org
Treece, G. M., Gee, A. H., Prager, R. W., Cash, C. J. C., & Berman, L. (2003). High-definition freehand
3-D ultrasound. Ultrasound in Medicine & Biology, 29(4), 529–546. doi:10.1016/S0301-5629(02)00735-4
PMID:12749923
Unsgard, G. (2009). Ultrasound-Guided Neurosurgery. In M. Sindou (Ed.), Practical Handbook of Neuro-
surgery (pp. 407–425). Vienna, Austria: Springer Vienna. doi:10.1007/978-3-211-84820-3_55
Wen, T., Zhu, Q., Qin, W., Li, L., Yang, F., Xie, Y., & Gu, J. (2013). An accurate and effective FMM-based
approach for freehand 3D ultrasound reconstruction. Biomedical Signal Processing and Control, 8(6), 645–656.
Yu, Y., & Acton, S. T. (2002). Speckle reducing anisotropic diffusion. IEEE Transactions on Image Pro-
cessing, 11(11), 1260–1270. doi:10.1109/TIP.2002.804276 PMID:18249696
Zhang, W. Y., Rohling, R. N., & Pai, D. K. (2004). Surface extraction with a three-dimensional freehand
ultrasound system. Ultrasound in Medicine & Biology, 30(11), 1461–1473. doi:10.1016/j.ultrasmed-
bio.2004.08.020 PMID:15588957

