Information & Data Science Pilot Projects
The Helmholtz Association has therefore specifically strengthened its expertise in the field of Information & Data Science.
Pilot projects 1
Five highly innovative research projects received funding totaling 17 million euros for three years in 2017.
The pilot project "Helmholtz Analytics Framework" will strengthen the development of data science in the Helmholtz Association. The project pursues a systematic development of domain-specific data analytics techniques in a co-design approach between domain scientists and information experts. In particular, the exchange of methods between individual scientific domains should lead to generalization and standardization. In challenging use cases such as Earth system modeling, structural biology, aerospace, medical imaging, or neuroscience, this creates the potential for scientific breakthroughs and new knowledge. The Helmholtz Analytics Framework cooperates closely with the already established Helmholtz Data Federation (HDF).
Further information and contact: Prof. Dr. Dr. Thomas Lippert (FZ-Jülich), th.lippert(at)fz-juelich.de, Prof. Dr. Achim Streit (KIT), achim.streit(at)kit.edu, Björn Hagemeier (FZ-Jülich), b.hagemeier(at)fz-juelich.de, Daniel Mallmann (FZ-Jülich), d.mallmann(at)fz-juelich.de.
The pilot project "Sparse2Big" provides methodological and technical foundations for dealing with Big Data. The goal is to create truly usable "Big Data" from sparsely observed, large data sets through imputation (completion) and robust modeling of the observation processes. The project will initially focus on datasets in single-cell genomics, using modern genome sequencing techniques to analyze individual cells: This will provide researchers with a "molecular microscope" with multiple applications, such as in developmental biology, cancer diagnostics, and stem cell therapy. The innovative techniques of Sparse2Big will contribute to a significant improvement of observations in single-cell genomics and thus biomedical research. Based on this, a transfer of these methods to other research areas is already being prepared.
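The idea of imputation mentioned above can be illustrated with a minimal sketch: missing entries in a cells-by-genes matrix are filled in from the observed values of the same gene. This is a toy stand-in, not the Sparse2Big methodology; the function name and data are illustrative only.

```python
# Toy imputation sketch for a sparse single-cell matrix (rows = cells,
# columns = genes). Missing values are marked with None. Illustrative
# only; real methods model the observation process far more carefully.

def impute_column_means(matrix):
    """Replace each missing entry with the mean of the observed
    values in the same column (gene)."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    result = [row[:] for row in matrix]  # work on a copy
    for j in range(n_cols):
        observed = [matrix[i][j] for i in range(n_rows)
                    if matrix[i][j] is not None]
        col_mean = sum(observed) / len(observed) if observed else 0.0
        for i in range(n_rows):
            if result[i][j] is None:
                result[i][j] = col_mean
    return result

cells = [
    [5, None, 2],
    [None, 4, 0],
    [3, 8, None],
]
completed = impute_column_means(cells)
```

In practice, column means would be replaced by a statistical model of the sequencing dropout process, but the overall shape — complete the sparse matrix, then analyze it — stays the same.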
Complex mathematical computer models are used in the fields of climate and environmental research, health research, energy research and in the development of robots. The flood of data from these models poses enormous challenges for researchers and computing centers, and new intelligent methods must be developed to overcome them. The Reduced Complexity Models pilot project addresses this by deriving models of reduced complexity with new methods from the information sciences. The project focuses on quantifying the uncertainties of simulations, developing simplified models, and identifying key dependencies. The goal is more stable models and more robust simulations for a wide range of applications. Specific challenging examples will be used to develop and test these new concepts and methods. The project will drive the development of interoperable and reusable technologies and thus contribute to the faster solution of future problems. In this way, it can also develop considerable societal potential.
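A common way to obtain such a simplified model is a surrogate: run the expensive simulation at a few sample points, then fit a cheap approximation to the results. The sketch below, with purely illustrative names and a trivial stand-in "simulation", shows the pattern under the assumption of a linear response.

```python
# Minimal surrogate-model sketch: replace an "expensive" simulation
# with a cheap linear fit over a handful of sampled runs.
# Names and the stand-in model are illustrative, not from the project.

def expensive_model(x):
    # Stand-in for a costly simulation run.
    return 3.0 * x + 1.0

def fit_linear_surrogate(xs, ys):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

samples = [0.0, 1.0, 2.0, 3.0]
outputs = [expensive_model(x) for x in samples]  # the only costly step
a, b = fit_linear_surrogate(samples, outputs)
surrogate = lambda x: a * x + b  # cheap replacement for further queries
```

Real reduced-complexity models use far richer function classes, but the division of labor is the same: a few expensive runs train a model that is then queried cheaply.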
Further information and contact: Prof. Dr. Corinna Schrum (HZG), corinna.schrum(at)hzg.de
The pilot project "Automated Scientific Discovery" will develop entirely new technologies to automatically discover relationships in large amounts of complex scientific data. To this end, the Helmholtz researchers are using highly innovative and reliable artificial intelligence methods. Initially, the project partners will use these methods to advance knowledge about nuclear fusion processes and Earth observation. In further steps, the scientists will combine basic research, applied research and the development of generic methods. The development of generic methods will be strengthened by close cooperation with the pilot project "Helmholtz Analytics Framework".
Further information and contact: Dr. Jakob Svensson (IPP), jakob.svensson(at)ipp.mpg.de
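In its simplest form, automatically discovering relationships in data means screening variable pairs for strong statistical dependence. The toy sketch below does this with Pearson correlation; it is a deliberately simple stand-in, and the variable names and threshold are illustrative — the project's actual AI methods are far more sophisticated.

```python
# Toy relationship-discovery sketch: flag variable pairs whose linear
# correlation exceeds a threshold. Illustrative only.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def discover_relationships(variables, threshold=0.9):
    """Return variable-name pairs with |correlation| above threshold."""
    names = list(variables)
    found = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if abs(pearson(variables[a], variables[b])) > threshold:
                found.append((a, b))
    return found

data = {
    "temperature": [1.0, 2.0, 3.0, 4.0],
    "pressure": [2.1, 4.0, 6.2, 8.1],   # nearly linear in temperature
    "noise": [5.0, -1.0, 3.5, 0.2],     # unrelated
}
pairs = discover_relationships(data)
```

Correlation screening only finds linear dependencies; the appeal of AI-based discovery is precisely that it can uncover nonlinear and multivariate relationships that such a screen misses.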
Imaging techniques are an essential source of information in almost every field of research. The pilot project "Imaging at the Limit", which will initially receive start-up funding, focuses on those aspects of image reconstruction that transform measurements into actual images. Among other things, clever exploitation of the information content in measurement datasets makes it possible to improve the efficiency of reconstruction and thus the usability of visualized data. The project aims to advance existing imaging methods and, with them, the many research fields that rely on imaging.
Pilot projects 2
The incubator continues to provide impetus to strengthen the community's information- and data-based research. In this way, the incubator catalyzes the development of pioneering projects that transcend the usual disciplinary and research field boundaries. With a second call for pilot projects in 2019, Helmholtz is investing an additional total of over 40 million euros in the future of research.
Climate change is particularly affecting the polar and permafrost regions through rising temperatures. Melting of ice sheets and thawing of permafrost are immediate consequences, leading among other things to a rise in sea level. These developments pose societal challenges that need to be quantified and understood. The Artificial Intelligence for Cold Regions (AI-CORE) project is a collaborative approach to unlock Artificial Intelligence (AI) methods for cryospheric research. The German Aerospace Center (DLR) will develop these together with the Alfred Wegener Institute (AWI) and the TU Dresden and make them available to the Helmholtz community as part of a platform.
Further information and contact: Dr. Andreas Dietz (DLR), Andreas.Dietz(at)dlr.de
How will the climate develop, how secure is our energy supply, and what opportunities does molecular medicine offer? Rapidly growing amounts of data open up fundamentally new possibilities for answering current questions from society, science and industry; however, data, findings and predictions are inevitably associated with uncertainty. The goal of the Uncertainty Quantification project is to quantify this uncertainty with methods of probability theory and to incorporate it into research and communication. The project links applied researchers from the four research areas Earth & Environment, Energy, Health and Information with each other as well as with Helmholtz data scientists and external university partners from mathematics and econometrics.
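One standard probabilistic tool for this is Monte Carlo uncertainty propagation: sample the uncertain input from its distribution, push each sample through the model, and report the spread of the outputs. The sketch below shows this pattern on a trivial linear model; all names and numbers are illustrative, not the project's methods.

```python
# Minimal Monte Carlo uncertainty-propagation sketch: an uncertain
# input (Gaussian) is pushed through a model, and the output spread
# is summarized. Illustrative only.
import random

def propagate_uncertainty(model, mean, std, n_samples=10000, seed=0):
    """Return (mean, standard deviation) of model outputs when the
    input is drawn from a Gaussian with the given mean and std."""
    rng = random.Random(seed)  # seeded for reproducibility
    outputs = [model(rng.gauss(mean, std)) for _ in range(n_samples)]
    out_mean = sum(outputs) / n_samples
    out_var = sum((y - out_mean) ** 2 for y in outputs) / n_samples
    return out_mean, out_var ** 0.5

# For a linear model y = 2x, the output std should be twice the input std.
mean_out, std_out = propagate_uncertainty(lambda x: 2.0 * x,
                                          mean=1.0, std=0.5)
```

The same scheme works unchanged for nonlinear models where no closed-form error propagation exists, which is why it is a workhorse of uncertainty quantification.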
The Pilotlab Exascale Earth System Modeling explores specific concepts for earth system models on exascale supercomputers. So-called extreme events - such as hurricanes, droughts or heavy rain triggered by climate change - can lead to drastic changes in society and the environment. Current climate models, however, are not yet precise enough, especially when it comes to simulating such events, and they need to be much more finely resolved. Yet the computing power of current supercomputers cannot simply be increased; among other things, this would require far too much energy. Fundamentally new modeling concepts are therefore needed. In PL-EESM, scientists and computer scientists are working together to develop the necessary software and new hardware concepts.
Further information and contact: PD Dr. Martin Schultz (Forschungszentrum Jülich), m.schultz(at)fz-juelich.de
Ptychography is a computational method that exploits correlated measurements to reconstruct an object based on diffraction images. The use of this 'virtual lens' allows microscopic imaging to be pushed beyond the limits of classical optics. The method has recently attracted tremendous interest, driven by the availability of an iterative algorithm for solving the reconstruction problem and of sufficient computational capacity to handle large amounts of data and high computational complexity. The project addresses the challenge of bringing ptychography to routine use with different radiation sources (X-rays, electrons, XUV light). To do so, optical expertise is combined with data science. Ptychography 4.0 separates data collection and processing so that resources are used where they are most appropriate.
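The separation of data collection and processing described above is, in software terms, a producer/consumer pipeline: the acquisition side pushes raw frames into a queue, and a processing side (possibly on entirely different hardware) consumes them independently. The sketch below shows the pattern with a trivial stand-in for the reconstruction step; it is a structural illustration only, not the project's software.

```python
# Sketch of decoupled acquisition and processing via a queue.
# The "reconstruction" here is a trivial stand-in (summing a frame).
import queue
import threading

def acquire(frames, out_q):
    """Producer: the detector side pushes raw diffraction frames."""
    for frame in frames:
        out_q.put(frame)
    out_q.put(None)  # sentinel: acquisition finished

def process(in_q, results):
    """Consumer: the compute side pulls frames and 'reconstructs'."""
    while True:
        frame = in_q.get()
        if frame is None:
            break
        results.append(sum(frame))  # stand-in for a reconstruction step

raw_frames = [[1, 2], [3, 4], [5, 6]]
q = queue.Queue()
results = []
producer = threading.Thread(target=acquire, args=(raw_frames, q))
producer.start()
process(q, results)   # consumer runs in the main thread
producer.join()
```

Because the two sides only share the queue, each can be scaled or relocated independently — the point of using resources "where they are most appropriate".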
Satellite technology, and especially GNSS technologies such as GPS, is becoming increasingly important for our society. However, plasma density structures in near-Earth space can significantly affect the propagation of GPS signals and thus the accuracy of GPS navigation. In addition, space plasmas can also damage satellites. To study these effects of the space environment, it is important to develop an accurate model of plasma density based on a variety of direct and indirect measurements.
In this pilot project, we will demonstrate how machine learning methods can be used to build a real-time global empirical model of near-Earth plasma density based on a large number of measurements. The model developed in this project will take advantage of all available data and can be used by a wide range of stakeholders for GPS navigation and satellite operations.
In all areas of science and society, big, complex, and high-dimensional data is ubiquitous today. Machine learning and AI methods are already very effective at using such data for predictions.
The SIMCARD project will develop novel machine learning methods that are robust and reliable and go beyond simple predictions. The focus is on new methods for modeling very large networks and predictive reliability. The goal is to provide answers to pressing problems in diverse application domains with precisely fitting scalable, sound and interpretable Data Science methods. The project addresses in particular the fields of data-intensive biomedicine and weather forecasting.
To meet future challenges, data, computing power and analytical expertise must be brought together on an unprecedented scale. Conventional analytics systems are often centralized and thus have significant drawbacks from a technical, legal, political and ethical perspective, especially due to complex security or trust requirements. As an alternative, the interdisciplinary project will facilitate the establishment of decentralized, collaborative data analysis architectures by bringing algorithms to the data in a trustworthy and legally compliant manner. TFDA will address the technical, methodological, and legal aspects necessary to ensure trustworthiness of analysis and transparency of analysis inputs and outputs. These developments will be validated in the use case of a federated radiation therapy study; then the results will be disseminated.
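The principle of "bringing algorithms to the data" can be sketched in a few lines: each site computes a local summary on its own data, and only those aggregates — never the raw records — are combined centrally. The site names and data below are illustrative, and real federated systems add authentication, auditing, and privacy safeguards on top.

```python
# Minimal federated-analysis sketch: compute a global mean from
# per-site summaries, so raw data never leaves a site. Illustrative.

def local_summary(data):
    """Runs inside a site; only the count and sum are shared."""
    return len(data), sum(data)

def federated_mean(summaries):
    """Combine (count, sum) pairs from all sites into a global mean."""
    total_n = sum(n for n, _ in summaries)
    total_s = sum(s for _, s in summaries)
    return total_s / total_n

# Hypothetical sites; in reality these would be separate institutions.
site_a = [1.0, 2.0, 3.0]
site_b = [4.0, 6.0]
mean = federated_mean([local_summary(site_a), local_summary(site_b)])
```

The same decomposition — local computation, central aggregation — generalizes from means to model training (federated learning), which is what makes decentralized clinical studies such as the radiation therapy use case feasible.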
For more information: https://tfda.hmsp.center