Abstract:
In the field of optical interferometry, two-dimensional projections of light
interference patterns are often analysed in order to obtain measurements of
interest. Such interference patterns, or interferograms, contain phase
information which is inherently wrapped onto the range -π to +π.
Phase unwrapping is the process of restoring the unknown multiple of 2π,
and therefore plays a major role in the overall process of interferogram
analysis.
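In other words (stating the standard relation, with symbols chosen here for illustration rather than taken from the thesis): at each pixel the true, unwrapped phase φ is related to the measured, wrapped phase ψ by

    φ = ψ + 2πk,   where k is an integer and ψ ∈ (-π, π],

so that phase unwrapping amounts to recovering the integer k at every pixel.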
Unwrapping phase information correctly becomes a challenging process in
the presence of noise. This is particularly the case for speckle
interferograms, which are noisy by nature.
Many phase unwrapping algorithms have been devised by workers in the
field in order to achieve better noise rejection and to improve computational
performance.
This thesis focuses on the computational efficiency aspect, and takes as its
starting point an existing phase unwrapping algorithm which has been shown
to have inherent noise immunity: namely, the tile-based phase
unwrapping method, which attains its enhanced noise immunity through the
application of the minimum spanning tree concept from graph theory.
The thesis examines, from a graph theory perspective, the problem of finding
a minimum spanning tree for this particular application, and shows that a
more efficient class of minimum spanning tree algorithms can be applied to
the problem.
The thesis then goes on to show how a novel algorithm can be used to
significantly reduce the size of the minimum spanning tree problem in an
efficient manner.