Invited Talk »Data Analysis and Management for High Resolution Solar Physics« at EST Meeting 2017 in Bairisch-Kölldorf, Austria

The GREGOR archive at AIP and its data analysis and management plan were presented as an invited talk at the EST Meeting in Bairisch-Kölldorf, Austria, on 2017 October 10.

In high-resolution solar physics, the volume and complexity of photometric, spectroscopic, and polarimetric ground-based data have increased significantly over the last decade, reaching data acquisition rates of terabytes per hour. This is driven on the one hand by the desire to capture fast processes on the Sun and on the other hand by the necessity for short exposure times »freezing« the atmospheric seeing, thus enabling post-facto image restoration. Solar features move with velocities of several kilometers per second in the photosphere and several tens of kilometers per second in the chromosphere, often exceeding the speed of sound. Eruptive phenomena in the chromosphere reach even higher velocities, in excess of 100 kilometers per second. The coherence time of wavefront distortions is of the order of milliseconds under daytime seeing conditions. Consequently, large-format and high-cadence detectors are nowadays used in solar observations to facilitate image restoration.

Based on our experience during the »early science« phase with the 1.5-meter GREGOR solar telescope (2014–2015) and the subsequent transition to routine observations in 2016, we describe data analysis and data management tailored towards image restoration and imaging spectroscopy. We outline our approaches regarding data processing, analysis, and archiving for two of GREGOR’s post-focus instruments, i.e., the GREGOR Fabry-Pérot Interferometer (GFPI) and the newly installed High-Resolution Fast Imager (HiFI).

The heterogeneous and complex nature of multi-dimensional data arising from high-resolution solar observations provides an intriguing but also challenging example of »big data« in astronomy – in particular when considering the next generation of 4-meter aperture solar telescopes. The big data challenge has two aspects: (1) creating a Collaborative Research Environment, where data and computationally intensive post-processing tools are co-located and collaborative work is enabled for scientists of multiple institutes, and (2) establishing a workflow for publishing the data for the whole community and beyond. This requires either collaboration with a data center or frameworks and databases capable of dealing with huge data sets, based on Virtual Observatory and other community standards and procedures. We present working approaches for both.
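The data acquisition rates quoted above can be illustrated with a quick back-of-envelope calculation. The following minimal Python sketch assumes, for illustration only, two synchronized sCMOS cameras with 2560 × 2160 pixels, 16-bit depth, and 50 frames per second; these numbers are assumptions, not the exact HiFI specifications.

```python
# Back-of-envelope data rate for a high-cadence solar imager.
# All detector parameters below are illustrative assumptions,
# not the exact specifications of HiFI.

width, height = 2560, 2160   # pixels per frame (assumed sCMOS format)
bit_depth = 16               # bits per pixel (assumed)
frame_rate = 50              # frames per second (assumed)
n_cameras = 2                # synchronized cameras (assumed)

bytes_per_frame = width * height * bit_depth // 8
rate_bps = bytes_per_frame * frame_rate * n_cameras   # bytes per second

print(f"per frame : {bytes_per_frame / 1e6:.1f} MB")
print(f"sustained : {rate_bps / 1e9:.2f} GB/s")
print(f"per hour  : {rate_bps * 3600 / 1e12:.1f} TB")
# -> roughly 11.1 MB per frame and 1.11 GB/s sustained, i.e. about
#    4 TB per hour, consistent with the "terabytes per hour" figure.
```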
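On the publishing side, the abstract points to Virtual Observatory standards. As a minimal sketch of what querying such a published archive could look like through a standard VO Table Access Protocol (TAP) service using the pyvo package: the endpoint URL and the table and column names below are hypothetical placeholders, not the actual GREGOR archive interface at AIP.

```python
import pyvo

# Hypothetical TAP endpoint -- a placeholder, not the actual
# GREGOR archive interface at AIP.
service = pyvo.dal.TAPService("https://archive.example.org/tap")

# ADQL query against an assumed observation table; table and
# column names are illustrative only.
result = service.search(
    "SELECT TOP 10 obs_id, instrument, date_obs "
    "FROM gregor.observations WHERE instrument = 'GFPI'"
)

for row in result:
    print(row["obs_id"], row["instrument"], row["date_obs"])
```

The point of a TAP-based interface is that any VO-aware client, not just this script, can discover and query the same tables with standard ADQL.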
