Addressing flux suppression, radio frequency interference, and selection of optimal solution intervals during radio interferometric calibration
- Authors: Sob, Ulrich Armel Mbou
- Date: 2020
- Subjects: CubiCal (Software) , Radio -- Interference , Imaging systems in astronomy , Algorithms , Astronomical instruments -- Calibration , Astronomy -- Data processing
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/147714 , vital:38663
- Description: The forthcoming Square Kilometre Array is expected to provide answers to some of the most intriguing questions about our Universe. However, as is already noticeable from MeerKAT and other precursors, the data volumes produced by these new instruments are significantly challenging to calibrate and image. Calibration of radio interferometric data is usually biased by incomplete sky models and radio frequency interference (RFI), resulting in calibration artefacts that limit the dynamic range and fidelity of the resulting images. One of the most noticeable of these artefacts is the formation of spurious sources, which causes suppression of real emission. Fortunately, it has been shown that calibration algorithms employing heavy-tailed likelihood functions are less susceptible to this owing to their robustness against outliers. Leveraging recent developments in the field of complex optimisation, we implement a robust calibration algorithm using a Student’s t likelihood function and Wirtinger derivatives. The new algorithm, dubbed the robust solver, is incorporated as a subroutine into the newly released calibration software package CubiCal. We perform statistical analysis on the distribution of visibilities, provide insight into the functioning of the robust solver, and describe different scenarios where it will improve calibration. We use simulations to show that the robust solver effectively reduces the amount of flux suppressed from unmodelled sources in both direction-independent and direction-dependent calibration. Furthermore, the robust solver is shown to successfully mitigate the effects of low-level RFI when applied to a simulated and a real VLA dataset. Finally, we demonstrate that there are close links between the amount of flux suppressed from sources, the effects of RFI, and the employed solution interval during radio interferometric calibration. Hence, we investigate the effects of solution intervals and the different factors to consider in order to select adequate solution intervals. Furthermore, we propose a practical brute-force method for selecting optimal solution intervals. The proposed method is successfully applied to a VLA dataset. (A schematic sketch of the robust reweighting idea follows this record.)
- Full Text:
- Date Issued: 2020
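To make the role of the heavy-tailed likelihood concrete, here is a minimal sketch of an iteratively reweighted gain update driven by Student's t weights. It assumes a scalar (single-polarisation) gain per antenna and a fixed noise scale; it illustrates the general technique only and is not CubiCal's actual solver or API.

```python
import numpy as np

def students_t_weights(residuals, v=2.0, sigma2=1.0):
    """EM-style weights from a Student's t likelihood with v degrees of
    freedom: visibilities with large residuals (unmodelled flux,
    low-level RFI) are down-weighted instead of dominating the fit."""
    d2 = np.abs(residuals) ** 2 / sigma2        # squared scaled residual
    return (v + 2.0) / (v + d2)

def robust_gain_step(g, model, vis, v=2.0, n_iter=20):
    """Schematic robust-solver loop: alternate a weighted least-squares
    gain update with Student's t reweighting of the residuals.
    `vis` and `model` are N_ant x N_ant visibility/model matrices."""
    for _ in range(n_iter):
        r = vis - np.outer(g, np.conj(g)) * model        # V - G M G^H
        w = students_t_weights(r, v)                     # robust weights
        # weighted least-squares update of each antenna gain g_p
        num = (w * vis * np.conj(model) * g[None, :]).sum(axis=1)
        den = (w * np.abs(model) ** 2 * np.abs(g[None, :]) ** 2).sum(axis=1)
        g = num / den
    return g
```

With Gaussian weights (w = 1 everywhere) this reduces to an ordinary least-squares gain solver; the t-distribution weights are what limit the influence of outlying visibilities.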
A pilot wide-field VLBI survey of the GOODS-North field
- Authors: Akoto-Danso, Alexander
- Date: 2019
- Subjects: Radio astronomy , Very long baseline interferometry , Radio interferometers , Imaging systems in astronomy , Hubble Space Telescope (Spacecraft) -- Observations
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/72296 , vital:30027
- Description: Very Long Baseline Interferometry (VLBI) has significant advantages in disentangling active galactic nuclei (AGN) from star formation, particularly at intermediate to high redshift, due to its high angular resolution and insensitivity to dust. Surveys using VLBI arrays are only just becoming practical over wide areas, thanks to numerous developments and innovations (such as multi-phase-centre techniques) in observation and data analysis. However, fully automated pipelines for VLBI data analysis are based on old software packages and are unable to incorporate new calibration and imaging algorithms. In this work, the researcher developed a pipeline for VLBI data analysis which integrates a recent wide-field imaging algorithm, RFI excision, and a purpose-built source-finding algorithm specifically developed for the 64k×64k wide-field VLBI images. The researcher used this novel pipeline to process 6% (~9 arcmin² of the total 160 arcmin²) of the data from the CANDELS GOODS-North extragalactic field at 1.6 GHz. The milliarcsecond-scale images have an average rms of ~10 μJy/beam. Forty-four (44) candidate sources were detected, most of which are at sub-mJy flux densities, having brightness temperatures and luminosities of >5×10⁵ K and >6×10²¹ W Hz⁻¹ respectively (a sketch of the brightness-temperature calculation follows this record). This work demonstrates that automated post-processing pipelines for wide-field, uniform-sensitivity VLBI surveys are feasible and indeed made more efficient with new software, wide-field imaging algorithms and more purpose-built source-finders. This broadens the discovery space for future wide-field surveys with upcoming arrays such as the African VLBI Network (AVN), MeerKAT and the Square Kilometre Array (SKA).
- Full Text:
- Date Issued: 2019
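The brightness-temperature threshold quoted above follows from the Rayleigh-Jeans relation for a Gaussian component. Below is a minimal sketch of that calculation; the example numbers (0.5 mJy source, 10 mas beam) are illustrative assumptions, not values from the survey catalogue.

```python
import numpy as np

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def brightness_temperature(flux_mjy, nu_ghz, bmaj_mas, bmin_mas):
    """Rayleigh-Jeans brightness temperature of a Gaussian component:
    T_b = S c^2 / (2 k nu^2 Omega), with beam solid angle
    Omega = pi * bmaj * bmin / (4 ln 2)."""
    s = flux_mjy * 1e-29                       # mJy -> W m^-2 Hz^-1
    mas = np.pi / 180.0 / 3.6e6                # milliarcsec -> radians
    omega = np.pi * (bmaj_mas * mas) * (bmin_mas * mas) / (4 * np.log(2))
    nu = nu_ghz * 1e9                          # GHz -> Hz
    return s * C**2 / (2 * K_B * nu**2 * omega)

# e.g. a 0.5 mJy component unresolved in a 10 x 10 mas beam at 1.6 GHz:
print(f"T_b ~ {brightness_temperature(0.5, 1.6, 10, 10):.2e} K")
```

The example prints a few times 10⁶ K, comfortably above the ~5×10⁵ K level that separates AGN-driven emission from star formation at milliarcsecond resolution.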
Modelling storm-time TEC changes using linear and non-linear techniques
- Authors: Uwamahoro, Jean Claude
- Date: 2019
- Subjects: Magnetic storms , Astronomy -- Computer programs , Imaging systems in astronomy , Ionospheric storms , Electrons -- Measurement , Magnetosphere -- Observations
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/92908 , vital:30762
- Description: Statistical models based on empirical orthogonal function (EOF) analysis and non-linear regression analysis (NLRA) were developed for the purpose of estimating the ionospheric total electron content (TEC) during geomagnetic storms. The well-known least squares method (LSM) and the Metropolis-Hastings algorithm (MHA) were used as optimisation techniques to determine the unknown coefficients of the developed analytical expressions. Artificial Neural Networks (ANNs), the International Reference Ionosphere (IRI) model, and the Multi-Instrument Data Analysis System (MIDAS) tomographic inversion algorithm were also applied to storm-time TEC modelling/reconstruction for various latitudes of the African sector and surrounding areas. This work presents some of the first statistical modelling of the mid-latitude and low-latitude ionosphere during geomagnetic storms that includes solar, geomagnetic and neutral wind drivers. Development and validation of the empirical models were based on storm-time TEC data derived from global positioning system (GPS) measurements over ground receivers within Africa and surrounding areas. The storm criterion applied was Dst ≤ −50 nT and/or Kp > 4. The performance evaluation of MIDAS compared with ANNs for reconstructing storm-time TEC over the African low- and mid-latitude regions showed that MIDAS and ANNs provide comparable results, with respective mean absolute error (MAE) values of 4.81 and 4.18 TECU (a sketch of this error metric follows this record). The ANN model was, however, found to perform 24.37% better than MIDAS at estimating storm-time TEC for low latitudes, while MIDAS is 13.44% more accurate than the ANN for the mid-latitudes. When their performances are compared with the IRI model, both MIDAS and the ANN model were found to provide more accurate storm-time TEC reconstructions for the African low- and mid-latitude regions. A comparative study of the performances of the EOF, NLRA, ANN, and IRI models for estimating TEC during geomagnetic storm conditions over various latitudes showed that the ANN model is about 10%, 26%, and 58% more accurate than the EOF, NLRA, and IRI models, respectively, while EOF was found to perform 15% and 44% better than NLRA and IRI, respectively. It was further found that the NLRA model is 25% more accurate than the IRI model. We have also investigated, for the first time, the contribution of meridional neutral winds (from the Horizontal Wind Model) to storm-time TEC modelling in the low-latitude, northern hemisphere mid-latitude and southern hemisphere mid-latitude regions of the African sector, based on ANN models. Statistics have shown that the inclusion of the meridional wind velocity in TEC modelling during geomagnetic storms leads to percentage improvements of about 5% for the low-latitude region, and 10% and 5% for the northern and southern hemisphere mid-latitude regions, respectively. High-latitude storm-induced winds and the inter-hemispheric flow of meridional winds from the summer to the winter hemisphere are suggested to be associated with these improvements.
- Full Text:
- Date Issued: 2019
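For concreteness, here is a minimal sketch of the error metric and the kind of relative-accuracy comparison quoted above. The percentage-improvement convention shown is an assumption for illustration; the thesis's exact definition may differ.

```python
import numpy as np

def mae(pred, obs):
    """Mean absolute error (in TECU) between modelled and GPS-derived TEC."""
    pred, obs = np.asarray(pred), np.asarray(obs)
    return np.mean(np.abs(pred - obs))

def percent_improvement(mae_a, mae_b):
    """Relative accuracy gain of model A over model B, from their MAEs."""
    return 100.0 * (mae_b - mae_a) / mae_b

# With the quoted overall MIDAS and ANN MAEs of 4.81 and 4.18 TECU:
print(f"ANN over MIDAS: {percent_improvement(4.18, 4.81):.1f}%")
```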
Statistical Analysis of the Radio-Interferometric Measurement Equation, a derived adaptive weighting scheme, and applications to LOFAR-VLBI observation of the Extended Groth Strip
- Authors: Bonnassieux, Etienne
- Date: 2019
- Subjects: Radio astronomy , Astrophysics , Astrophysics -- Instruments -- Calibration , Imaging systems in astronomy , Radio interferometers , Radio telescopes , Astronomy -- Observations
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/93789 , vital:30942
- Description: J.R.R. Tolkien wrote, in his Mythopoeia, that “He sees no stars who does not see them first, of living silver made that sudden burst, to flame like flowers beneath the ancient song”. In his defense of myth-making, he formulates the argument that the attribution of meaning is an act of creation - that “trees are not ‘trees’ until so named and seen” - and that this capacity for creation defines the human creature. The scientific endeavour, in this context, can be understood as a social expression of a fundamental feature of humanity, and from this endeavour flows much understanding. This thesis, one thread among many, focuses on the study of astronomical objects as seen by the radio waves they emit. What are radio waves? Electromagnetic waves were theorised by James Clerk Maxwell (Maxwell 1864) in his great theoretical contribution to modern physics, their speed matching the speed of light as measured by Ole Christensen Rømer and, later, James Bradley. It was not until Heinrich Rudolf Hertz’s 1887 experiment that these waves were measured in a laboratory, leading to the dawn of radio communications - and, later, radio astronomy. The link between radio waves and light was one of association: light is known to behave as a wave (Young’s double-slit experiment), with the same propagation speed as electromagnetic radiation. Light “proper” is also known to exist beyond the optical regime: Herschel’s experiment showed that when dispersed through a prism, sunlight warms even those parts of a desk which are not observed to be lit (the first evidence of infrared light). The link between optical light and unseen electromagnetic radiation is then an easy step to make, and one confirmed through countless technological applications (e.g. optical fiber, to name but one). And as soon as this link is established, a question immediately comes to the mind of the astronomer: what does the sky, our Universe, look like to the radio “eye”? Radio astronomy has a short but storied history: from Karl Jansky’s serendipitous observation in 1933 of the centre of the Milky Way, which outshines our Sun in the radio regime, to Grote Reber’s hand-built back-yard radio antenna in 1937, which successfully detected radio emission from the Milky Way itself, to such monumental projects as the Square Kilometre Array and its multiple pathfinders, it has led to countless discoveries and the opening of a truly new window on the Universe. The work presented in this thesis is a contribution to this discipline - the culmination of three years of study, which is a rather short time to get a firm grasp of radio interferometry both in theory and in practice. The need for robust, automated methods - which are improving daily, thanks to the tireless labour of the scientists in the field - is becoming ever stronger as the SKA approaches, looming large on the horizon; but even today, in the precursor era of LOFAR, MeerKAT and other pathfinders, it is keenly felt. When I started my doctorate, the sheer scale of the task at hand felt overwhelming - to actually be able to contribute to its resolution seemed daunting indeed! Thankfully, as the saying goes, no society sets for itself material goals which it cannot achieve.
This thesis took place at an exciting time for radio interferometry: at the start of my doctorate, the LOFAR international stations were - to my knowledge - only beginning to be used, and even then, only tentatively; MeerKAT had not yet shown its first light; the techniques used throughout my work were still being developed. At the time of writing, great strides have been made. One of the greatest technical challenges of LOFAR - imaging using the international stations - is starting to become a reality. This technical challenge is the key problem that this thesis set out to address. While we have achieved only partial success so far, it is a testament to the difficulty of the task that it is not yet truly resolved. One of the major results of this thesis is a model of a bright resolved source near a famous extragalactic field: properly modelling this source not only allows the use of the international LOFAR stations, but also grants deeper access to the extragalactic field itself, which is otherwise polluted by the 3C source’s sidelobes. This result was only achieved thanks to the other major result of this thesis: the development of a theoretical framework with which to better understand the effect of calibration errors on images made from interferometric data, and an algorithm to strongly mitigate them. The structure of this manuscript is as follows: we begin with an introduction to radio interferometry, LOFAR, and the emission mechanisms which dominate in our field of interest. These introductions are primarily intended to give a brief overview of the technical aspects of the data reduced in this thesis. We follow with an overview of the Measurement Equation formalism, which underpins our theoretical work and is the keystone of this thesis. We then present the theoretical work developed during the doctorate, which was published in Astronomy & Astrophysics. Its practical application - a quality-based weighting scheme - is used throughout our data reduction (a minimal sketch of such an inverse-variance weighting appears after this record). This data reduction is the final topic of this thesis: we contextualise the scientific interest of the data we reduce, and explain both the methods and the results we achieve.
- Full Text:
- Date Issued: 2019
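To give a flavour of what a quality-based weighting scheme does, here is a minimal inverse-variance sketch: estimate the residual power per baseline and down-weight the noisier visibilities. This is an illustration under simple assumptions (variance estimated over the time axis), not the published algorithm from the Astronomy & Astrophysics paper.

```python
import numpy as np

def quality_weights(residuals):
    """Inverse-variance weights from calibration residuals.
    `residuals` has shape (n_time, n_baseline): for each baseline,
    estimate its residual variance over time (thermal noise plus
    calibration-error power) and weight its visibilities by 1/variance."""
    var = np.var(residuals, axis=0, keepdims=True)   # per-baseline variance
    weights = 1.0 / np.maximum(var, 1e-12)           # guard against zeros
    return np.broadcast_to(weights, residuals.shape)
```

Imaging with such weights suppresses the baselines whose calibration errors would otherwise imprint artefacts on the map.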
PyMORESANE: A Pythonic and CUDA-accelerated implementation of the MORESANE deconvolution algorithm
- Authors: Kenyon, Jonathan
- Date: 2015
- Subjects: Radio astronomy , Imaging systems in astronomy , MOdel REconstruction by Synthesis-ANalysis Estimators (MORESANE)
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:5563 , http://hdl.handle.net/10962/d1020098
- Description: The inadequacies of the current generation of deconvolution algorithms are rapidly becoming apparent as new, more sensitive radio interferometers are constructed. In light of these inadequacies, there is renewed interest in the field of deconvolution. Many new algorithms are being developed using the mathematical framework of compressed sensing. One such technique, MORESANE, has recently been shown to be a powerful tool for the recovery of faint diffuse emission from synthetic and simulated data. However, the original implementation is not well suited to large problem sizes due to its computational complexity. Additionally, its use of proprietary software prevents it from being freely distributed and used. This has motivated the development of a freely available Python implementation, PyMORESANE. This thesis describes the implementation of PyMORESANE as well as its subsequent augmentation with MPU and GPGPU code. These additions accelerate the algorithm and thus make it competitive with its legacy counterparts. The acceleration is verified by means of benchmarking tests for varying image size and complexity (a sketch of such a benchmark harness follows this record). Additionally, PyMORESANE is shown to work not only on synthetic data, but also on real observational data. This verification means that the MORESANE algorithm, and consequently the PyMORESANE implementation, can be added to the current arsenal of deconvolution tools.
- Full Text:
- Date Issued: 2015
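A benchmark of this kind can be expressed as a small timing harness. The sketch below accepts any callable `deconvolve(dirty, psf)` (CPU- or GPU-backed) and uses random stand-in images; it is illustrative only and not PyMORESANE's actual test suite.

```python
import time
import numpy as np

def benchmark(deconvolve, sizes=(512, 1024, 2048), repeats=3):
    """Time a deconvolution routine on dirty images of increasing size,
    reporting the best wall-clock time per size (best-of-N reduces
    jitter from the OS and, for GPU code, one-off kernel compilation)."""
    for n in sizes:
        dirty = np.random.randn(n, n).astype(np.float32)       # stand-in image
        psf = np.random.randn(2 * n, 2 * n).astype(np.float32) # double-size PSF
        times = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            deconvolve(dirty, psf)
            times.append(time.perf_counter() - t0)
        print(f"{n}x{n}: best of {repeats} = {min(times):.3f} s")
```

Running the same harness over the pure-Python and GPGPU code paths gives the speed-up curve as a function of image size.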