LSST will be a survey wider and deeper than all previous surveys combined. It will measure the properties of tens of billions of galaxies, allowing us to probe the nature of the Universe with unprecedented precision. However, to take advantage of this vast quantity of data, we need to design new, more efficient and more accurate measurement algorithms. Existing algorithms simply won't cut it. This is particularly true in gravitational lensing, where there is active development and competition within the community to find the best way of measuring the subtle distortions in galaxy shapes caused by the gravitational field of matter in the Universe.
In this spirit, I've recently formed a collaboration between experts in theory, statistics, computing and astronomy to develop an entirely new way of measuring galaxy shapes. It is the first fully Bayesian approach to the problem of weak gravitational lensing, where galaxy shapes are distorted by at most a percent. We're just getting started, but our early results are promising! The project I want to work on over the summer will take these results and extend them to problems specific to LSST. For example, how will this algorithm deal with the distortions imprinted on galaxy images by imperfections in the telescope and the atmosphere? This has direct implications for work currently being carried out at the Dark Energy Survey and at CFHT.
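To give a flavour of what "fully Bayesian" means here, below is a toy sketch. It is emphatically not our collaboration's method: it assumes an oversimplified model where each observed ellipticity is a constant shear plus Gaussian intrinsic shape and Gaussian noise, with all numbers purely illustrative. The point is that we evaluate the whole posterior over the shear rather than reducing each galaxy to a point estimate:

```python
import jax
import jax.numpy as jnp

# Toy data: a constant 1% shear applied to 10,000 galaxies, with Gaussian
# intrinsic ellipticities and measurement noise (all numbers illustrative).
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
n_gal = 10_000
g_true, sigma_int, sigma_n = 0.01, 0.25, 0.10
e_obs = (g_true
         + sigma_int * jax.random.normal(k1, (n_gal,))
         + sigma_n * jax.random.normal(k2, (n_gal,)))

def log_posterior(g):
    # Flat prior, so the log posterior is just the Gaussian log likelihood
    # of e_i ~ N(g, sigma_int^2 + sigma_n^2), up to a constant.
    var = sigma_int**2 + sigma_n**2
    return -0.5 * jnp.sum((e_obs - g) ** 2) / var

# Evaluate the posterior on a grid of shear values: the full distribution
# is carried forward, not a single best-fit number.
g_grid = jnp.linspace(-0.05, 0.05, 1001)
log_post = jax.vmap(log_posterior)(g_grid)
post = jnp.exp(log_post - log_post.max())
dg = g_grid[1] - g_grid[0]
post = post / (post.sum() * dg)

g_mean = (g_grid * post).sum() * dg
print(f"posterior mean shear: {g_mean:.4f} (true value {g_true})")
```

With ten thousand galaxies the posterior on the shear is already narrower than the percent-level signal itself, which is exactly why surveys at LSST's scale make this kind of measurement possible at all.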
Since LSST is inherently a big-data project, the work will also involve intensive computing as we process simulated data. If the student is interested, we can also develop GPU algorithms to speed up the calculations.
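As a hedged illustration of that GPU angle, here is the same toy grid evaluation restructured so every (shear value, galaxy) pair is computed in parallel. I've used JAX purely for the sketch (it's my choice for illustration, not a project commitment): the jit-compiled function runs on a GPU automatically when one is attached and falls back to the CPU otherwise.

```python
import jax
import jax.numpy as jnp

@jax.jit  # compiled by XLA; dispatched to a GPU when one is available
def grid_log_posterior(e_obs, g_grid, var):
    # One residual per (shear value, galaxy) pair: embarrassingly
    # parallel arithmetic of exactly the kind GPUs are built for.
    resid = g_grid[:, None] - e_obs[None, :]
    return -0.5 * jnp.sum(resid**2, axis=1) / var

key = jax.random.PRNGKey(1)
e_obs = 0.01 + 0.27 * jax.random.normal(key, (50_000,))  # toy catalogue
g_grid = jnp.linspace(-0.05, 0.05, 501)

log_post = grid_log_posterior(e_obs, g_grid, 0.27**2)
print("running on:", jax.devices()[0].platform)  # 'gpu' if attached, else 'cpu'
```

The real pipelines will of course be far more involved, but this is the basic pattern: turn the per-galaxy likelihood evaluations into one big data-parallel computation and let the hardware chew through it.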