Two prominent X-ray emission lines of highly charged iron have puzzled astrophysicists for decades: their measured and calculated brightness ratios consistently disagree. This hampers reliable determinations of plasma temperatures and densities. New high-precision measurements, together with state-of-the-art calculations, now exclude all explanations proposed so far for this discrepancy, and thus deepen the problem. Researchers from the GSI Helmholtzzentrum für Schwerionenforschung Darmstadt and the Helmholtz Institute Jena (HI Jena), a branch of GSI, are also involved in the investigations. The results are now published in the journal “Physical Review Letters”.
Hot astrophysical plasmas fill intergalactic space and shine brightly in stellar coronae, active galactic nuclei, and supernova remnants. They contain charged atoms (ions) that emit X-rays observable by satellite-borne instruments. Astrophysicists rely on these spectral lines to derive parameters such as plasma temperatures or elemental abundances. Two of the brightest X-ray lines arise from iron atoms that have lost 16 of their 26 electrons, Fe16+ ions – also known in astrophysics as Fe XVII. Iron is rather abundant in the universe; it lets stars similar to our Sun burn their hydrogen fuel very slowly for billions of years by nearly stopping the energy flowing as radiation from the fiery fusion core to the comparatively cool stellar surface.
For more than forty years, X-ray astronomers have been troubled by a serious problem with the two key Fe16+ lines: the ratio of their measured intensities significantly disagrees with theoretical predictions. The same holds for laboratory measurements, but the uncertainties of both experiment and theory have so far been too large to settle the issue.
An international team of 32 researchers led by groups from the Max Planck Institute for Nuclear Physics (MPIK) and the NASA Goddard Space Flight Center has now published the outcome of a renewed, massive effort to resolve this discrepancy. They performed both the highest-resolution measurements reported so far and several state-of-the-art quantum-theoretical calculations.
Steffen Kühn, PhD student at MPIK and responsible for the setup, describes the effort: “To resonantly excite highly charged iron ions, we continuously generate them with our compact mobile electron beam ion trap (PolarX-EBIT) and irradiate them with X-rays from the PETRA III synchrotron at DESY. We find the lines by scanning the photon energy over the range where they should appear and observing the fluorescence light. To handle the experimental data flow, colleagues from 19 institutions worked at DESY and then spent more than a year painstakingly analysing and cross-checking the results.”
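To give a flavour of this scanning scheme, the following Python sketch simulates a photon-energy scan in which a resonance appears as an excess of fluorescence counts over a flat background. It is purely illustrative: the line position, width, and count rates are invented, and this is not the collaboration's analysis code.

```python
# Illustrative sketch (not the collaboration's code): locating a line by
# scanning the photon energy and watching the fluorescence yield.
# The line position, width, and rates below are invented for the example.
import numpy as np

rng = np.random.default_rng(0)

def expected_counts(energy_eV, line_eV=812.0, width_eV=0.3,
                    peak_rate=200.0, background_rate=5.0):
    """Expected counts per energy step: a Lorentzian resonance on a flat background."""
    half = width_eV / 2
    lorentz = half**2 / ((energy_eV - line_eV) ** 2 + half**2)
    return background_rate + peak_rate * lorentz

# Scan the photon energy across the region where the line should appear.
energies = np.arange(810.0, 814.0, 0.05)
counts = rng.poisson(expected_counts(energies))  # photon shot noise

# The resonance shows up as a clear excess over the background.
print(f"resonance found near {energies[np.argmax(counts)]:.2f} eV")
```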
To make sure that everything was consistent, the researchers combined three different measurement procedures to determine the intensity ratio of the two Fe16+ lines, dubbed 3C and 3D. First, overall scans revealed the line positions, widths, and intensities. Second, the experimentalists set the energy of the X-ray photons to the peak fluorescence yield while cyclically switching the photon beam off and on to remove the strong background. Third, they scanned the lines again, this time applying the on-off trick simultaneously in order to reduce instrumental effects. “This way, we could derive the most accurate value of the brightness ratio to date, and with ten times higher spectral resolution than earlier work”, states Chintan Shah, NASA postdoctoral fellow. “Moreover, the properties of the PETRA III beam excluded possible nonlinear effects, dependent on the synchrotron photon flux, that may have affected earlier measurements”, adds Sven Bernitt, researcher at the Helmholtz Institute Jena and one of the project leaders, who works in the group of Thomas Stöhlker, HI Jena Director and Deputy Research Director of GSI and FAIR. Remarkably, the resulting intensity ratio confirms earlier astrophysical and laboratory measurements with much reduced uncertainty.
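How such scans can be turned into an intensity ratio may be illustrated by a minimal, hypothetical Python sketch: the beam-off spectrum is subtracted from the beam-on spectrum, two line profiles are fitted to the result, and the ratio of the fitted line strengths is taken. All positions, widths, and amplitudes below are invented; this is not the published analysis.

```python
# Illustrative sketch only: deriving an intensity ratio for two nearby lines
# from a scan with beam-on/beam-off background subtraction.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(e, a1, c1, a2, c2, sigma, offset):
    """Two Gaussian line profiles sharing one instrumental width."""
    g1 = a1 * np.exp(-0.5 * ((e - c1) / sigma) ** 2)
    g2 = a2 * np.exp(-0.5 * ((e - c2) / sigma) ** 2)
    return g1 + g2 + offset

rng = np.random.default_rng(1)
e = np.linspace(810.0, 828.0, 400)

# Simulated beam-on and beam-off scans; chopping the photon beam lets the
# steady trap background be measured separately and subtracted.
scan_on = rng.poisson(two_gaussians(e, 100.0, 812.2, 300.0, 825.8, 0.4, 20.0))
scan_off = rng.poisson(np.full_like(e, 20.0))
net = scan_on.astype(float) - scan_off.astype(float)  # background-subtracted

p0 = [80.0, 812.0, 250.0, 826.0, 0.5, 0.0]  # initial guesses for the fit
popt, _ = curve_fit(two_gaussians, e, net, p0=p0)
a1, _, a2, _, _, _ = popt

# With a common width, the ratio of line areas reduces to the amplitude ratio.
print(f"intensity ratio (higher- / lower-energy line) = {a2 / a1:.2f}")
```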
Theory teams around Natalia Oreshkina at MPIK, together with groups from Australia, the USA, and Russia, applied three independent, very-large-scale relativistic quantum-theoretical methods, letting clusters of hundreds of processors run hot for weeks. This computational marathon delivered concordant results at high numerical precision. However, while the calculated energy difference between the two lines agrees well with the measured value, the intensity ratio clearly departs from the experimental result. “There are no other known quantum-mechanical effects or numerical uncertainties to consider within our approaches”, emphasizes Marianna Safronova, professor at the University of Delaware.
Thus, the cause of the discrepancy between the experimental and theoretical intensity ratios of the 3C and 3D lines of Fe16+ remains puzzling, since all effects that could perturb the measurements were suppressed as far as possible, and the remaining uncertainty is well understood. As a consequence, astrophysical parameters derived from X-ray line intensities are, to some degree, uncertain. While this is unsatisfactory, “the new accurate experimental result may be immediately used to empirically correct the astrophysical models”, recommends Maurice Leutenegger, also a NASA researcher. “Upcoming space missions with advanced X-ray instrumentation, such as ESA's Athena X-ray Observatory, will soon start sending an incredible stream of high-resolution data to the ground, and we have to be prepared to understand it and squeeze the maximum value from those billion-dollar investments.” (MPI/BP)
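One possible reading of such an empirical correction, sketched here in Python with placeholder numbers (the ratios below are not values quoted from the paper), is to rescale a plasma model's predicted 3C intensity so that its 3C/3D ratio matches the laboratory value:

```python
# Hypothetical sketch of an empirical correction to an astrophysical model.
# Both ratios are placeholders, not the published measurement or calculation.
ratio_measured = 3.1    # placeholder: laboratory I(3C)/I(3D)
ratio_calculated = 3.5  # placeholder: ab initio I(3C)/I(3D)

def correct_model_3c(predicted_3c_flux: float) -> float:
    """Scale the modelled 3C flux by the measured-to-calculated ratio."""
    return predicted_3c_flux * ratio_measured / ratio_calculated

print(correct_model_3c(1.0))  # a unit model flux becomes ~0.886
```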
This news is based on a press release of the Max Planck Institute for Nuclear Physics, Heidelberg.