
ENABLING DYNAMIC AND INTELLIGENT WORKFLOWS FOR HPC, DATA ANALYTICS AND AI CONVERGENCE

Date: June 09, 2022

eFlows4HPC launches its reference publication

The publication focuses on the challenges of the lifecycle management of complex workflows that integrate HPC simulations with Data Analytics and AI algorithms. eFlows4HPC researchers analyse several use cases that employ these complex workflows, such as climate modelling and the prototyping of complex manufactured objects. The publication also includes the first version of the eFlows4HPC software architecture and the HPC Workflow as a Service methodology. eFlows4HPC researchers aim to demonstrate the workflow software stack through the use cases of three application Pillars that are of high industrial and social relevance: manufacturing, climate and urgent computing for natural hazards.

Digital twins in manufacturing

eFlows4HPC contributes to the use of workflows in the development of Digital Twins in manufacturing. The eFlows4HPC team leverages the expressiveness of the PyCOMPSs programming model to compose the concurrent, though not trivially parallelizable, tasks needed to construct the engineering workflow. Using a state-of-the-art workflow management system ensures that the Pillar workflow can effectively take advantage of large-scale computing systems, thus removing existing bottlenecks in terms of computational time and available system memory. The availability of a reliable deployment pipeline for the developed software guarantees that project outcomes can be effectively packaged and deployed on a variety of test systems, thus greatly enhancing the potential impact of the newly created twins.
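
For readers unfamiliar with PyCOMPSs, the sketch below illustrates the general idea: plain Python functions are turned into asynchronous tasks via decorators, and the runtime infers their data dependencies so that independent simulations can run concurrently. The task names (simulate_case, assemble_twin) and the toy computations are hypothetical placeholders, not the actual Pillar workflow.

```python
from pycompss.api.task import task
from pycompss.api.parameter import COLLECTION_IN
from pycompss.api.api import compss_wait_on

@task(returns=1)
def simulate_case(load):
    # Stand-in for one simulation of the manufactured part under a given
    # load (toy computation only, hypothetical example).
    return load * 0.001

@task(displacements=COLLECTION_IN, returns=1)
def assemble_twin(displacements):
    # Stand-in for reducing the individual results into the twin's model;
    # here simply an average.
    return sum(displacements) / len(displacements)

def digital_twin_workflow(loads):
    # Each call spawns an asynchronous task; the runtime detects that
    # assemble_twin depends on every simulate_case result and can execute
    # the independent simulations in parallel.
    results = [simulate_case(load) for load in loads]
    twin = assemble_twin(results)
    return compss_wait_on(twin)  # synchronise only on the final value

if __name__ == "__main__":
    print(digital_twin_workflow([10.0, 20.0, 30.0]))
```

Launched with the runcompss command, a script written in this style can be distributed across the nodes of a large-scale system, with synchronisation deferred to the final result.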

Advancing climate modelling

eFlows4HPC seeks to improve execution efficiency and enable scientists to tackle more complex problems in end-to-end Earth System Modelling (ESM) workflows. To do so, the project explores novel dynamic solutions and integrates data-driven approaches (HPDA, ML/AI) with compute-driven components.

Urgent computing for natural hazards

Earthquakes and tsunamis are unpredictable and devastating events that can have catastrophic socioeconomic impacts. Urgent Computing (UC) for Natural Hazards emerged thanks to rapid increases in computational power, new technologies developed for data monitoring systems and advanced analytical techniques. UC links High Performance Computing (HPC), state-of-the-art simulation codes, readily available data and High Performance Data Analytics (HPDA) to provide insights into the impacts and potential damages immediately following an extreme event. Developing UC workflows for earthquakes and tsunamis involves deploying advanced tools and complex tasks to ultimately bring them to an operational level. UC workflows take advantage of the eFlows4HPC software stack to improve technology and provide fast solutions for mitigating the effects of potentially catastrophic earthquakes and tsunamis.