Extragalactic Astronomy

Master thesis projects

Cosmology, gravitational lensing, and large scale structure

Contact: Stella Seitz (stella@usm.lmu.de), Arno Riffeser (arri@usm.lmu.de), Roberto Saglia (saglia@usm.lmu.de), OPINAS Group (http://www.mpe.mpg.de/1761897/Master-_und_Doktorarbeiten)

  • Theoretical and computational weak lensing cosmology

    Contact: Anik Halder (ahalder@usm.lmu.de), Laurence Gong (lgong@usm.lmu.de), Stella Seitz (stella@usm.lmu.de)


    One of the goals of modern cosmology is to understand how the cosmic web of our Universe came to be. Answering this question requires an accurate determination of the Universe's matter and energy content, more precisely, quantifying the amounts of dark matter and dark energy: two elusive components which dominate the total energy budget of our Universe but whose nature remains a mystery to modern physics. A primary tool for quantifying the energy content is the weak gravitational lensing effect, i.e. the statistical study of coherent distortions imprinted on the shapes of far-away (background) galaxies by the gravitational lensing effect of the foreground matter distribution in the Universe. Statistical analysis of the observed weak lensing data facilitates the inference of various cosmological model parameters and allows us to put constraints on the amounts of dark matter and dark energy in the Universe. The goal of these projects is to investigate both conventional and higher-order statistics of the weak lensing field, the latter holding the potential to improve upon the cosmological parameter constraints obtained from conventional 2-point (lower-order) statistical techniques. In our group, we have been developing a novel and promising higher-order weak lensing statistic called the 'integrated 3-point correlation function' (integrated 3PCF). Several interesting milestones remain to be achieved for the robust development and implementation of this statistic. We are therefore looking for highly motivated students with strong mathematical and scientific computing backgrounds (prior programming experience in Python would be an asset) to perform theoretical modelling and computational tasks in this context. Knowledge of cosmology and gravitational lensing would be highly beneficial.

    The master projects exploring the integrated 3PCF will look into one or a combination of the following topics (depending on the progress and interests of the student):

    - A key ingredient in the estimation of the integrated 3PCF from cosmic shear data is the so-called 'aperture mass' statistic. As the name suggests, this statistic probes the projected line-of-sight weak lensing 'mass' within a given aperture in one's field of view. In practice, it is measured through a weighted sum of the tangential shear signal of background source galaxies around the centre of the aperture (a minimal sketch of such an estimator is given after this list). Current techniques for measuring the aperture mass statistic are slow, which in turn makes the estimation of the integrated 3PCF slow; for cosmological analyses with the integrated 3PCF this is a major challenge. In this project, the student will implement a fast and optimized way of measuring the aperture mass statistic with parallelisation techniques (and possibly GPU acceleration) so as to make possible a robust and fast measurement of the integrated 3PCF.

    - A major systematic effect in weak lensing statistics is the so-called 'intrinsic alignment' of galaxies. If this effect is not accounted for in one's model, it can lead to severe biases in the inferred cosmological parameters. In this project, we will explore the effect of intrinsic alignments on the integrated 3PCF: we will measure the integrated 3PCF in weak lensing simulations infused with intrinsic alignments and then validate our model for the integrated 3PCF against these measurements.

    - We are actively working on projects where we apply Machine Learning techniques (such as emulation with Deep Neural Networks and Gaussian Processes) using state-of-the-art hardware accelerators (GPUs) and software packages such as TensorFlow to build the framework for the implementation and analysis of 2PCFs and integrated 3PCFs (a minimal emulator sketch is given below). We are therefore also looking for students who are interested in working on Machine Learning techniques in cosmology.
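
    To illustrate the aperture mass measurement mentioned above, here is a minimal numpy sketch: it computes the statistic as a filter-weighted sum of tangential ellipticities of background galaxies around an aperture centre. The compensated filter Q and all variable names are illustrative assumptions, not our production code.

        import numpy as np

        def tangential_ellipticity(x, y, e1, e2, x0, y0):
            # tangential component of the measured ellipticities about the aperture centre
            phi = np.arctan2(y - y0, x - x0)
            return -(e1 * np.cos(2.0 * phi) + e2 * np.sin(2.0 * phi))

        def aperture_mass(x, y, e1, e2, x0, y0, theta_ap):
            # Q-weighted sum of the tangential shear inside the aperture; the
            # polynomial compensated filter below is an illustrative choice
            r = np.hypot(x - x0, y - y0)
            inside = r < theta_ap
            u = r[inside] / theta_ap
            Q = u**2 * (1.0 - u**2)
            e_t = tangential_ellipticity(x[inside], y[inside], e1[inside], e2[inside], x0, y0)
            area = (x.max() - x.min()) * (y.max() - y.min())
            n_density = x.size / area                      # mean source number density
            return np.sum(Q * e_t) / (np.pi * theta_ap**2 * n_density)

    Evaluated naively on a grid of aperture centres, this costs O(N_apertures x N_galaxies), which is exactly the bottleneck the project aims to remove via parallelisation and GPU acceleration.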
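
    And as a flavour of the Machine-Learning emulation mentioned above, a minimal TensorFlow sketch of a neural-network emulator mapping cosmological parameters to a binned statistic; the architecture, parameter ranges, and data shapes are placeholders, and the random 'training data' stands in for expensive theory-code evaluations.

        import numpy as np
        import tensorflow as tf

        # placeholder training set: 2 cosmological parameters -> 20-bin data vector
        n_train, n_params, n_bins = 1000, 2, 20
        params = np.random.uniform([0.2, 0.6], [0.4, 1.0], size=(n_train, n_params))
        target = np.random.normal(size=(n_train, n_bins))   # would be e.g. integrated-3PCF vectors

        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(n_params,)),
            tf.keras.layers.Dense(128, activation='relu'),
            tf.keras.layers.Dense(128, activation='relu'),
            tf.keras.layers.Dense(n_bins),                  # emulated data vector
        ])
        model.compile(optimizer='adam', loss='mse')
        model.fit(params, target, epochs=10, batch_size=64, verbose=0)

        prediction = model.predict(params[:1])              # fast surrogate for the theory code

    Once trained on real theory-code outputs, such an emulator makes likelihood evaluations in an MCMC cheap enough for parameter inference.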

  • Astrophysics and Cosmology with Machine Learning

    Contact: Stella Seitz (stella@usm.lmu.de), Tamas N. Varga (vargatn@usm.lmu.de), Laurence Gong (lgong@usm.lmu.de), Anik Halder (ahalder@usm.lmu.de)

    We offer various Machine Learning (ML) Master's theses at any time, e.g. in the context of weak lensing statistics (see above), photometric redshifts, statistical descriptions of galaxy clusters, or the identification of rare or 'weird' objects. We also offer topics using Convolutional Neural Networks (CNNs) in astrophysics and cosmology.

  • Constraining the Mass Distribution and Cosmological Parameters with Galaxy Strong Lensing

    Contact: Giacomo Queirolo (queirolo@usm.lmu.de), Stella Seitz (stella@usm.lmu.de), Arno Riffeser (arri@usm.lmu.de)

    General Relativity predicts that light paths follow geodesics, determined by the curvature of space, and hence are not "straight lines" in general. If mass distributions are sufficiently dense, as they are in the cores of galaxies, they can strongly lens other galaxies or QSOs in their background. In such cases, we can see the background objects several times ("multiple images"), and in the case of a lensed galaxy, we can see it strongly distorted into "arcs". One can use the imaging data of such strong lensing galaxies to measure the mass distribution of the lensing galaxy. Usually, this is done using parametrized lens mass models, obtaining the best-fitting parameters and their confidence limits with Markov Chain Monte Carlo (MCMC).
    If the multiply lensed background object is a QSO, then one can also determine the "size" of the Universe or its "age". This is because the fluxes of QSOs usually vary with time. Such variations are then observed in the multiple images with some time lag, which is determined by the geometrical and potential-depth differences along the different lines of sight (determined by the strong lensing model) and by the Hubble constant. Hence, if the time delay is measured and the lens galaxy is constrained by the strong lensing model, the Hubble constant can be measured, too.
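
    Schematically (this is standard time-delay cosmography, written here for reference), the delay between two images at angular positions \theta_i and \theta_j of a source at \beta is

        \Delta t_{ij} = \frac{D_{\Delta t}}{c}\left[\phi(\theta_i,\beta)-\phi(\theta_j,\beta)\right],
        \qquad \phi(\theta,\beta) = \frac{(\theta-\beta)^2}{2}-\psi(\theta),
        \qquad D_{\Delta t} = (1+z_l)\,\frac{D_l D_s}{D_{ls}} \propto \frac{1}{H_0},

    where \psi is the lensing potential (the "potential depth" above), \phi the Fermat potential (the "geometry"), and the D are angular diameter distances. A measured delay plus a lens model for \psi therefore fixes D_{\Delta t}, and with it H_0.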

    We offer three projects, which will not only yield worthwhile and desirable results but will also allow you to acquire skills that (in addition to your analytical skills) will make you a strong candidate to continue your career in science as a doctoral student. Depending on your progress and your interests, the projects below can also be mixed (such that you work on two of them at a time).

    You are a good candidate for such projects if you have basic knowledge of Python programming and a strong interest in cosmology and astrophysics. Ideally, you have already attended an overview astrophysics lecture and a cosmology lecture. Even if you have a background in physics "only", you are a good candidate if you are highly motivated to learn and to solve problems, and if you like to work in a group, receiving guidance but also sharing your experience with other group members. In our regular seminars, you will learn how to present your results or the content of publications, and how to discuss a particular problem or topic in astrophysics. Our group also has experience in Machine Learning, which we regularly apply when suitable.

    Galaxy Strong Lensing Project 1: How to obtain a Strong Lensing Model of a galaxy strong lensing system which was imaged with the Hubble Space Telescope (HST).

    - We will teach you how to use a public state-of-the-art strong lensing program, "lenstronomy", to model the mass distribution of the lens (a minimal sketch follows this list).
    - We will show you how to measure the 2-dimensional light profile of the lensing galaxies, also using state-of-the-art galaxy light fitting programs.
    - You will learn what the properties of (lensing) galaxies are regarding their spectral energy distribution (SED) and their mass distribution, composed of luminous (stars) and dark matter, and how these dark and luminous mass components are constrained.
    - You will learn to use Markov Chain Monte Carlo (MCMC) to determine confidence intervals for parameters and to identify degeneracies between fitting parameters.
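
    As a first flavour of "lenstronomy" (the package named above; the lens model and parameter values here are arbitrary illustrations, not a real system), a minimal sketch that sets up a singular isothermal ellipsoid (SIE) lens and solves the lens equation for the multiple images of a background point source:

        from lenstronomy.LensModel.lens_model import LensModel
        from lenstronomy.LensModel.Solver.lens_equation_solver import LensEquationSolver

        # singular isothermal ellipsoid lens with illustrative parameters (arcsec units)
        lens_model = LensModel(lens_model_list=['SIE'])
        kwargs_lens = [{'theta_E': 1.2, 'e1': 0.1, 'e2': 0.05, 'center_x': 0.0, 'center_y': 0.0}]

        # solve the lens equation for a source at (0.05", 0.02")
        solver = LensEquationSolver(lens_model)
        x_img, y_img = solver.image_position_from_source(0.05, 0.02, kwargs_lens)
        print('multiple image positions [arcsec]:', x_img, y_img)

        # dimensionless surface mass density (convergence) at the image positions
        kappa = lens_model.kappa(x_img, y_img, kwargs_lens)

    In the actual project, such parametrized models are fitted to HST imaging, with MCMC providing the confidence intervals mentioned above.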

    Galaxy Strong Lensing Project 2: How to obtain the Gravitational Lensing Time Delay and finally the Hubble Constant from a galaxy strong lensing system that was monitored with our Wendelstein 2m Telescope over 2 years.

    We have been observing a strong lensing galaxy system with our Wendelstein 2m Telescope for about 2 years. In this system, a QSO is multiply imaged and variable over time. The data have been reduced almost up to the present day. You will measure the light curves of the individual QSO images; by comparing them, you will be able to measure by how many days they are shifted relative to each other. Using results from the Strong Lensing Project 1, you can then determine the Hubble Constant.

    - If you are interested, you will have the possibility to go observe with our 2m Wendelstein Telescope regularly. We will show you how to do this -- it is great fun!
    - We will teach you how to reduce and analyse newly incoming data from our Wendelstein observatory.
    - You will learn how one accounts for observational effects like the point spread function ("seeing") when images observed at different times are compared in order to measure the flux differences of the multiply imaged QSOs. The technique is called "difference imaging". It is also used to identify transiting planets or extragalactic transient events like supernovae, counterparts of Gamma Ray Bursts, or Gravitational Wave events, and hence is very versatile in its application.
    - We will introduce you to a state-of-the-art public Python software package that correlates the light curves of the multiple images in order to obtain a time delay estimate (the basic idea is sketched after this list).
    - You will find out what "microlensing" is about, how it complicates the time delay measurement, and how you can overcome this complication, but also how microlensing can be used to learn something about the lensing galaxy and the lensed QSO.
    - Finally, you will derive the gravitational time delay of the observed strong lensing system and combine it with the strong lensing model to obtain a Hubble Constant estimate.
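
    To illustrate the basic idea behind such time-delay measurements (a deliberately simplified numpy stand-in for the dedicated software, which in reality must also handle microlensing, photometric errors, and irregular sampling), one can shift one light curve against the other and minimize the squared residuals:

        import numpy as np

        def estimate_delay(t, flux_a, flux_b, trial_delays):
            # grid search: evaluate curve B shifted by each trial delay at A's epochs
            # and return the delay minimizing the summed squared residual
            chi2 = []
            for dt in trial_delays:
                b_shifted = np.interp(t, t - dt, flux_b)
                resid = flux_a - b_shifted
                resid -= resid.mean()                  # allow a constant magnitude offset
                chi2.append(np.sum(resid**2))
            return trial_delays[int(np.argmin(chi2))]

        # toy light curves: image B lags image A by 12 days
        t = np.linspace(0.0, 400.0, 200)
        flux_a = np.sin(2 * np.pi * t / 90.0) + 0.05 * np.random.randn(t.size)
        flux_b = np.sin(2 * np.pi * (t - 12.0) / 90.0) + 0.05 * np.random.randn(t.size)

        print(estimate_delay(t, flux_a, flux_b, np.arange(-30.0, 30.0, 0.5)))  # ~ 12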

    Galaxy Strong Lensing Project 3: What happens if galaxies do not behave as we describe them in an idealized way? Triaxial versus "simple" elliptical lens models and their implications for lensing time delay Hubble constant estimates.

    This project will be carried out in collaboration with the MPE-OPINAS part of our group. Contact: Roberto Saglia (saglia@usm.lmu.de), Jens Thomas (jthomas@mpe.mpg.de)

    When we describe the strong lensing effect of galaxies (and galaxy clusters), we usually do this with relatively simple, idealized parametrized models: we describe the projected mass distribution as having elliptical iso-density contours. This is motivated by the fact that the light distribution of elliptical galaxies at first view also has elliptical surface brightness contours (and the same might be true for the dark matter), and by the fact that these simplified lens models describe the observations of the lensing effect (multiple image positions, distortion into extended arcs) very well. Keeping this simplified assumption is thus driven by its "success" and by the general principle of keeping the number of free parameters of a model as small as possible and as large as necessary.

    But when we look closer or think more about it, it is clear that this is an idealized assumption: the light distribution of observed elliptical galaxies changes from the inside out, in the sense that the ellipticity of the surface brightness contours and the major-axis angle change ("isophote twists"). Also, if one imagines galaxies to be 3-dimensional ellipsoids (in terms of the 3-dimensional light and total matter density contours), then the 2D projections along the line of sight (which we observe in the light, and which strong lensing is sensitive to) only have constant ellipticities and position angles if the axis ratios do not change with radius; this is guaranteed, for instance, if two of the three axes are equal (an "oblate"/"mandarin" or a "prolate"/"cigar" like structure). In general, galaxies are 3-dimensional objects, and projecting a 3-dimensional density ellipsoid with axes a > b > c whose axis ratios change with radius yields surface brightness and surface mass density contours whose ellipticity and major-axis angle change with radius.
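
    To make the projection statement concrete, a small numpy illustration (our own sketch, not project code): for a density stratified on ellipsoids m^2 = x^T M x, integrating along the line of sight produces surface density contours stratified on the 2D quadratic form given by the Schur complement of M. Evaluating it shell by shell, with axis ratios that change with radius, shows the projected ellipticity and position angle twisting:

        import numpy as np

        def projected_shape(M):
            # project the 3D quadratic form m^2 = x^T M x along z: the surface
            # density is stratified on the 2D Schur complement M2 = A - b b^T / c
            A, b, c = M[:2, :2], M[:2, 2], M[2, 2]
            M2 = A - np.outer(b, b) / c
            evals, evecs = np.linalg.eigh(M2)           # ascending eigenvalues
            q = np.sqrt(evals[0] / evals[1])            # projected axis ratio b/a
            pa = np.degrees(np.arctan2(evecs[1, 0], evecs[0, 0]))  # major-axis angle
            return q, pa

        def shape_matrix(b_over_a, c_over_a, R):
            # ellipsoid x^2 + (y a/b)^2 + (z a/c)^2 = m^2, rotated by R
            D = np.diag([1.0, 1.0 / b_over_a**2, 1.0 / c_over_a**2])
            return R @ D @ R.T

        # one generic, fixed viewing orientation
        a1, a2 = np.radians(35.0), np.radians(25.0)
        Ry = np.array([[np.cos(a1), 0, np.sin(a1)], [0, 1, 0], [-np.sin(a1), 0, np.cos(a1)]])
        Rx = np.array([[1, 0, 0], [0, np.cos(a2), -np.sin(a2)], [0, np.sin(a2), np.cos(a2)]])
        R = Rx @ Ry

        # axis ratios changing from the inside out -> projected ellipticity and PA change
        for r, (ba, ca) in zip([1, 3, 10], [(0.9, 0.8), (0.8, 0.6), (0.7, 0.4)]):
            q, pa = projected_shape(shape_matrix(ba, ca, R))
            print(f"r = {r:2d}: projected axis ratio q = {q:.2f}, PA = {pa:+6.1f} deg")

    With radius-independent axis ratios, the loop would return the same q and PA on every shell, which is exactly why constant-ellipticity lens models have been so "successful".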

    One uses strong lensing galaxy systems and their observed gravitational lens time delays to estimate the Hubble constant. This is usually done for a sample of such systems, in order to decrease the statistical error. Currently, about 7 such systems have been used, and once combined, an error of a few percent on the Hubble constant was claimed. The value supports previously claimed high Hubble constant estimates, in contrast to the low Hubble constant estimates from the microwave background fluctuations (CMB) measured by the Planck satellite mission. This discrepancy is called the "Hubble tension".
    However, combining several systems in order to decrease the statistical error (and increase the precision) is only possible if the systematic error budget is under control, because systematics could bias the results (and may make the values for H0 artificially high). One of the systematic errors one can make is using a too simplified lens model, e.g. assuming that the ellipticity and the major-axis angle of the projected surface density of the lensing galaxy's mass distribution are constant. This shall be investigated in this project. You will take a realistic lens mass distribution, characteristic of a triaxial mass distribution in projection, and predict strong lensing observables like the positions and flux ratios of multiply imaged QSOs and their time delays for a given Hubble constant. You will then take your predicted positions and analyze your own "mock data", pretending that you do not know about this complication and just using a simplified elliptical mass distribution. You will be able to quantify the difference between the 'true' Hubble constant used for your mock data and the 'derived' Hubble constant obtained from the analysis of your mock data. So you perform a 'synthetic' experiment (where you know the perfect truth) and derive the biases of Hubble constant estimates based on simplified assumptions for the lens model.

    - You will learn a lot about strong lensing theory and parametrized descriptions of mass distributions. These mass distributions are also used in dynamical analyses of galaxies and to describe clusters of galaxies, and hence are of wide general use.
    - We will help you to understand and change the public "lenstronomy" software such that it produces the mock "observations" of your artificial lenses (a minimal forward-modelling sketch follows this list).
    - You will learn how to predict the projected quantities of a 3-dimensional structure, regarding the differences in the line-of-sight integrated gravitational potential or lensing deflection angles.
    - We will show you how to obtain confidence levels of parameters with Markov Chain Monte Carlo (MCMC).
    - You will find out whether, and in which cases, the "traditional" description of lensing galaxies (and clusters) as projected systems with constant ellipticities is sufficient, and how to account for the usually unknown 3D structure when estimating Hubble constants (increasing the errors but protecting against biased results). You will also experience the use and impact of priors in Bayesian statistics. Such experience is extremely valuable in any modern (astro)physical measurement, in particular when data of different origins are combined.
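
    As a sketch of the forward-modelling step of such a synthetic experiment (with an SIE stand-in for the triaxial 'truth' and purely illustrative parameters), one can let lenstronomy predict image positions and relative time delays for an assumed 'true' Hubble constant via the Fermat potential:

        import numpy as np
        from astropy import units as u, constants as const
        from astropy.cosmology import FlatLambdaCDM
        from lenstronomy.LensModel.lens_model import LensModel
        from lenstronomy.LensModel.Solver.lens_equation_solver import LensEquationSolver

        z_l, z_s = 0.5, 2.0
        cosmo = FlatLambdaCDM(H0=70.0, Om0=0.3)        # the "true" cosmology of the mock

        # time-delay distance D_dt = (1 + z_l) D_l D_s / D_ls, proportional to 1/H0
        D_l = cosmo.angular_diameter_distance(z_l)
        D_s = cosmo.angular_diameter_distance(z_s)
        D_ls = cosmo.angular_diameter_distance_z1z2(z_l, z_s)
        D_dt = (1 + z_l) * D_l * D_s / D_ls

        # "true" lens: an SIE with illustrative parameters
        lens = LensModel(lens_model_list=['SIE'])
        kwargs_true = [{'theta_E': 1.2, 'e1': 0.1, 'e2': 0.0, 'center_x': 0.0, 'center_y': 0.0}]

        # mock observables: image positions and relative delays of a QSO at (0.03", 0.01")
        solver = LensEquationSolver(lens)
        x_img, y_img = solver.image_position_from_source(0.03, 0.01, kwargs_true)
        phi = lens.fermat_potential(x_img, y_img, kwargs_true)       # in arcsec^2

        arcsec2 = (1.0 * u.arcsec).to(u.rad).value ** 2
        delays = (D_dt / const.c * (phi - phi[0]) * arcsec2).to(u.day)
        print('mock image positions [arcsec]:', x_img, y_img)
        print('mock relative time delays:', delays)

    Refitting these mock observables with a simpler (e.g. constant-ellipticity) lens model and comparing the inferred H0 with the input 70 km/s/Mpc is precisely the bias test described above.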

  • On measuring gravitational weak lensing in simulated cluster imaging data

    Contact: Tamas N. Varga (vargatn@usm.lmu.de)

    Clusters (and groups) of galaxies are the largest gravitationally bound structures in our Universe. We like to study them for several reasons. Dense environments like groups and clusters of galaxies lead to a change of galaxy properties: when galaxies fall into these gravitational potentials during their build-up and further growth, their star formation is quenched very quickly, and galaxies turn from "blue" star-forming spirals into red S0s and elliptical galaxies in which stars only passively age; hence, they develop "red" spectral energy distributions (SEDs). On the other hand, the time and mass evolution of collapsed structures is sensitive to the density fluctuations present and measurable in the early Universe (CMB), and it depends on the expansion history of the Universe, which is governed by the so-called cosmological parameters, like the dark matter and dark energy content of the Universe and the dark energy equation of state. We think we can predict the number density of such clusters and groups relatively easily, because it is mostly determined by gravitational collapse (i.e. gravity, which we believe to understand) and by the above-mentioned cosmological parameters. Hence, turning this around, measuring the numbers and masses of galaxy clusters and galaxy groups is agreed to be a sensitive tool to constrain cosmological parameters and to confirm or exclude our theory of gravity (GR).

    The task hence boils down to counting collapsed structures (clusters) and measuring their masses.

    Regarding the second part, weak gravitational lensing is agreed to yield mass measurements with relatively large scatter but small systematic errors; hence, stacking many clusters promises to yield an unbiased mean mass for cluster samples (a minimal stacking sketch is given below). When we analyse the weak lensing effect of clusters, we have to identify the galaxies in the background of the clusters and measure how much they are distorted by the weak gravitational lensing effect of the clusters in their foreground. The effect typically changes the axis ratio by one percent or so! So we measure a small effect, and we would like to measure it precisely. For this reason, it is invaluable if the whole measurement process can be studied in a simulated 'mock experiment'.
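
    For illustration, a minimal numpy sketch of the stacking idea (simplified assumptions and illustrative names; real analyses add shear calibration, source redshift weights, and covariances): average the tangential ellipticity of background galaxies in radial bins around many cluster centres.

        import numpy as np

        def stacked_shear_profile(clusters, r_bins):
            # mean tangential ellipticity per radial bin, stacked over clusters;
            # each cluster is a dict of galaxy positions (relative to its centre)
            # and measured ellipticity components
            g_t_sum = np.zeros(len(r_bins) - 1)
            counts = np.zeros(len(r_bins) - 1)
            for cl in clusters:
                r = np.hypot(cl['x'], cl['y'])
                phi = np.arctan2(cl['y'], cl['x'])
                e_t = -(cl['e1'] * np.cos(2 * phi) + cl['e2'] * np.sin(2 * phi))
                idx = np.digitize(r, r_bins) - 1
                for i in range(len(r_bins) - 1):
                    sel = idx == i
                    g_t_sum[i] += e_t[sel].sum()
                    counts[i] += sel.sum()
            return g_t_sum / np.maximum(counts, 1)

    Because the per-galaxy signal is at the percent level while intrinsic galaxy shapes scatter at the tens-of-percent level, only such averaging over many sources and clusters makes the measurement possible.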

    We are members of the Dark Energy Survey (DES) and the Vera Rubin Observatory / LSST collaboration. In the first collaboration, we have studied and led analyses of the weak lensing effect of clusters (McLintock & Varga / Varga). In the second, the survey is soon to start, and for it we are developing a simulation tool for clusters. We start from a statistical description of clusters of galaxies and line-of-sight galaxies, i.e. we know the probability distributions of the magnitudes, colors, morphologies, and radial distribution of galaxies along cluster lines of sight, and we can generate (by random draws) an arbitrary number of such realizations at the catalog level. Once we have generated such a catalog, we can create images of such fields (i.e. mock data, which look like the clusters we will observe in the near future with LSST). These data can then be analyzed by us and our international colleagues: we can measure the strength of the weak lensing effect and compare it with the inserted true value. Then we know how "good" we are!

    In addition, we can test our galaxy selection methods: do we identify the right galaxies in our 'background' catalog, and do we also recover the correct fluxes or magnitudes? How do the errors depend on the position within the cluster? We expect that the identification, flux, and shape measurements become more difficult towards the center of clusters, because galaxies become rather crowded (i.e. they influence each other's detection and flux estimates) and because intra-cluster light adds a smooth extended component to the light distribution associated with the galaxies.

    Your project will be part of an ongoing effort in our group.

    - You will learn about the theory of the weak lensing effect and all the little details needed to estimate a weak lensing cluster mass at the end.
    - You will find out what the properties of clusters of galaxies are, regarding their galaxy population (SED, morphology), radius, "richness", and redshift.
    - We will show you how to create (FITS) images of mock galaxies, and you can see how they compare with images of already observed clusters, like in DES. You will use the public state-of-the-art tool "GalSim" for this purpose (a minimal sketch follows this list).
    - You will have the possibility to apply a state-of-the-art method for measuring how galaxies respond to gravitational shear, and to compare the result with the gravitational shear you inserted into your simulations.
    - Once you have estimated the errors (bias and statistical error), you can quantify whether the true masses of clusters of galaxies are recovered. This work will bring you to the forefront, ready to carry out the same analysis on the final DES data or the first LSST data, once you start a PhD project at one of the many member institutes of these large collaborations.
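
    As a first flavour of "GalSim" (the tool named above; profile and noise parameters here are purely illustrative), a minimal sketch that renders one weakly sheared mock galaxy into a FITS postage stamp:

        import galsim

        # an illustrative mock galaxy: Sersic profile, weakly sheared, seen through a PSF
        gal = galsim.Sersic(n=1.5, half_light_radius=0.5, flux=1e4)  # sizes in arcsec
        gal = gal.shear(g1=0.02, g2=0.00)                            # the inserted "true" shear
        psf = galsim.Moffat(beta=3.5, fwhm=0.9)                      # atmospheric seeing
        obj = galsim.Convolve([gal, psf])

        image = obj.drawImage(nx=64, ny=64, scale=0.2)               # 0.2"/pixel postage stamp
        image.addNoise(galsim.GaussianNoise(sigma=5.0))              # simple background noise
        image.write('mock_galaxy.fits')

    A cluster-field simulation repeats this for every galaxy drawn from the catalog-level description above, plus the smooth intra-cluster light component.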

Instrumentation and observational projects 

Contact: Ulrich Hopp (hopp@usm.lmu.de), Claus Gössl (cag@usm.lmu.de), Arno Riffeser (arri@usm.lmu.de), Frank Grupp (fug@usm.lmu.de), Hans-Joachim Hess (achim@usm.lmu.de), Florian Lang-Bardl (flang@usm.lmu.de)

  • Development of instrument control systems with Beckhoff PLCs for large telescope systems

    This Master's thesis requires an interest in electronic control systems and sensor technology. Prior knowledge of electronics (e.g. from corresponding Master's or Bachelor's lectures) and of PLC technologies in particular is an advantage. In the context of building the MICADO instrument for the 39-m EELT telescope in Chile, various mechanisms and electronic control components have to be developed, built, and tested. Mechanisms and sensors must be tested in a test cryostat (~80 K) at the USM and controlled by Beckhoff PLCs. The work comprises the design, execution, and documentation of tests of various hardware at room temperature and at ~80 K in our cryostat. The results must be prepared such that they can be used by other international consortium partners within the MICADO project. Beckhoff PLCs are used as the control electronics. In addition, depending on the exact topic, a purely astrophysical observing and/or data analysis project can be carried out in collaboration with the Wendelstein Observatory.
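
    To give a flavour of how such test hardware can be read out in practice (a hedged sketch: it assumes a TwinCAT PLC exposing a variable named MAIN.fTemperature and uses the open-source pyads library; the AMS address and variable name are placeholders, not our actual setup):

        import pyads

        # connect to the Beckhoff PLC via ADS (placeholder AMS net ID, TwinCAT 3 PLC port)
        plc = pyads.Connection('5.12.82.20.1.1', pyads.PORT_TC3PLC1)
        plc.open()

        # read a cryostat temperature sensor exposed by the PLC program (hypothetical name)
        temperature = plc.read_by_name('MAIN.fTemperature', pyads.PLCTYPE_REAL)
        print(f'cryostat temperature: {temperature:.1f} K')

        plc.close()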

For more topics offered at the OPINAS part of our group, please visit: https://www.mpe.mpg.de/1761897/Master-_und_Doktorarbeiten