Our approach to fluid dynamics research is broad, drawing on theoretical, computational, and statistical techniques, with an emphasis on developing hybrid modeling frameworks that combine physics-based models with the versatility of data-driven approaches.

Sponsored Research

Physics-reinforced Machine Learning Algorithms for Multiscale Closure Model Discovery

Sponsor: DOE Early Career Research Program Award DE-SC0019290

Period: 2018-2023

Advances in artificial intelligence have led to a renaissance in learning and extracting patterns from complex data. Despite successes in other areas, the application of machine learning techniques to fluid mechanics is relatively new, and most efforts have focused on tuning parameters of existing turbulence models. This project explores a big data approach that learns from physical constraints without assuming any heuristics for the underlying turbulence physics. Our overall research program will develop novel physics-reinforced data-driven approaches for geophysical turbulence. The research will also involve deep learning approaches that can discover closure models for complex multiscale systems. Research insights will facilitate the building of improved numerical weather prediction models and better parameterization strategies for DOE mission-relevant challenges.

Collaborative Research: Data-Driven Variational Multiscale Reduced Order Models for Biomedical and Engineering Applications

Sponsor: NSF DMS-2012255

Collaborators: Traian Iliescu (Virginia Tech), Alessandro Veneziani (Emory University)

Period: 2020-2023

Mathematical models are a fundamental tool for improving our knowledge of natural and industrial processes. Their use in practice depends on their reliability and efficiency. Reliability requires fine-tuning of the model parameters and an accurate assessment of sensitivity to noisy inputs. Efficiency is particularly critical in optimization problems, where the computational procedure identifies the best working conditions of a complex system. These requirements lead to solving models with millions or even billions of unknowns many times over, a process that may require days or weeks of computation on high-performance computing facilities. To mitigate these costs, we need new modeling strategies that allow model runs in minutes to hours on local computing facilities (such as a laptop). Reduced order models (ROMs) are extremely low-dimensional approximations that can decrease the computational cost of current computational models by orders of magnitude. With biomedical and wind-engineering applications in mind, this project proposes novel methods of model reduction. Data and numerical results from the expensive (or high-fidelity) models are combined with machine learning approaches to obtain ROMs that attain both efficiency and accuracy at an unprecedented level. The new data-driven ROM framework will finally make possible the numerical simulation of aortic dissections, pediatric surgery, or wind farm optimization on a laptop in minutes, and it aims to become a critical and trustworthy tool in decision-making processes.

Develop design criteria for psychrometric air sampler and mixer apparatus for use in ASHRAE test standards

Sponsor: ASHRAE (RP-1733) Technical Committee TC 8.11 - Unitary and Room Air Conditioners and Heat Pumps

PI: Christian Bach (OSU), co-PI: Omer San

Period: 2018-2020

The project scope covers developing the necessary testing methods for the mixers, developing new mixers and air samplers, characterizing their performance, and, as a final step, evaluating the overall in-situ performance of the newly developed devices with coil tests of 1.5-ton, 3-ton, and 5-ton sizes. The objective of this project is to provide (i) design recommendations for measuring bulk air conditions and (ii) methods for validating the performance of a sampler and mixer combination (where a mixer can be utilized) that would provide the most accurate bulk temperature and humidity measurement at the indoor air inlet and indoor air outlet.

Effect of inlet duct and damper design on ASHRAE 37/116 fan performance and static pressure measurements

Sponsor: ASHRAE (RP-1743) Technical Committee TC 8.11 - Unitary and Room Air Conditioners and Heat Pumps

PI: Christian Bach (OSU), co-PI: Omer San

Period: 2017-2020

This project investigates the effects of inlet duct design on fan power, the optimum static pressure measurement location, and the air flow profile at the exit of the inlet duct. It offers specific guidelines for the design of the inlet ductwork, narrowing the design space toward inlet duct designs known to have little effect on equipment performance, as shown by comparable outlet flow profiles obtained from a validated CFD method and experiments.

Blind deconvolution of massively separated turbulent flows

Sponsor: NASA Oklahoma Space Grant Consortium/NASA EPSCoR Research Initiation Grant

PI: Omer San, co-PI: Prakash Vedula (University of Oklahoma)

Period: 2017-2018

Turbulence models are a key tool for understanding complex flow physics in many applications. In this project, we explored the feasibility of a heuristics-free turbulence modeling framework, with the goal of developing robust and accurate closure models for coarse-grained simulations in massively separated turbulent flow regimes.

Featured Research

Machine Learning for Fluid Dynamics: Turbulence Modeling

Over the past few decades, exponential increases in computational power, algorithmic advances, and new experimental data collection strategies have driven an explosion in modeling efforts that leverage information obtained from physical data. In our lab, we are developing physics-constrained machine learning tools to identify the nonlinear relationship between filtered and unfiltered quantities and thereby obtain reliable closure models for large eddy simulations (LES). Our research has led to various mechanisms that exploit the strengths of different functional and structural closure modeling strategies while preserving trends from direct numerical simulations. In these studies, we have developed several data-driven machine learning subgrid-scale models for LES, considering both regression and classification points of view. Ongoing research in our group aims to provide a basis for predictive technologies across a broad spectrum of closure modeling problems, which could facilitate improved numerical weather prediction and climate research tools.
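The regression viewpoint can be sketched in a few lines: a small neural network learns a map from resolved-flow features to a subgrid stress proxy. Everything below is an illustrative stand-in, not one of our published models; the synthetic target law, feature choices, and network size are assumptions for demonstration only.

```python
import numpy as np

# Sketch of subgrid-scale closure as supervised regression: a tiny
# neural network maps resolved-flow features to a subgrid stress proxy.
# Synthetic data stands in for filtered DNS fields (an assumption).
rng = np.random.default_rng(0)

n = 400
X = rng.uniform(-1.0, 1.0, (n, 2))   # e.g. strain rate and vorticity features
y = (0.1 * X[:, 0] * np.abs(X[:, 0]) + 0.05 * X[:, 1]).reshape(-1, 1)

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)    # loss at initialization

lr = 0.05
for _ in range(2000):
    H, pred = forward(X)
    err = (pred - y) / n             # gradient of MSE w.r.t. pred (up to 2x)
    gW2 = H.T @ err; gb2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)   # backpropagate through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)      # training loss after fitting
```

The same pattern scales up in practice: richer invariant features replace the two toy inputs, and the trained map is queried at every grid point of a coarse simulation.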

Feature Engineering and Symbolic Regression

To address a key limitation of black-box learning methods, we have explored the use of symbolic regression as a principle for identifying relations and operators related to the underlying turbulence dynamics. This approach combines evolutionary computation with feature engineering to provide a tool for creating new models. So far, our approach has mainly involved gene expression programming (GEP) and sequential threshold ridge regression (STRidge) algorithms. We demonstrate our results in three different applications: (i) equation discovery, (ii) truncation error analysis, and (iii) hidden physics discovery, for which we include both predicting unknown source terms from a set of sparse observations and discovering subgrid-scale closure models. We show that both GEP and STRidge are able to distill the Smagorinsky model from an array of tailored features when solving the Kraichnan turbulence problem. Our results demonstrate the great potential of these techniques for complex physics problems and reveal the importance of feature selection and feature engineering in model discovery approaches.
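As a rough illustration of the STRidge idea, the sketch below solves a ridge regression over a library of candidate terms and repeatedly prunes coefficients below a threshold, so that only a sparse set of terms survives. The library, threshold, and regularization values are illustrative assumptions, and the demo uses synthetic data rather than turbulence features.

```python
import numpy as np

def stridge(Theta, y, lam=1e-3, tol=0.1, n_iter=10):
    """Sequential threshold ridge regression (STRidge) sketch.

    Solves min ||Theta w - y||^2 + lam ||w||^2, then repeatedly zeroes
    coefficients with magnitude below tol and refits on the surviving
    columns, promoting a sparse symbolic model.
    """
    n_terms = Theta.shape[1]
    # Initial ridge solution over the full candidate library
    w = np.linalg.solve(Theta.T @ Theta + lam * np.eye(n_terms), Theta.T @ y)
    for _ in range(n_iter):
        small = np.abs(w) < tol      # candidate terms to prune
        w[small] = 0.0
        big = ~small
        if not big.any():
            break
        A = Theta[:, big]            # refit using surviving terms only
        w[big] = np.linalg.solve(A.T @ A + lam * np.eye(big.sum()), A.T @ y)
    return w

# Synthetic demo: the target depends on only two of five candidate terms.
rng = np.random.default_rng(0)
Theta = rng.standard_normal((200, 5))
w_true = np.array([0.0, 1.5, 0.0, -2.0, 0.0])
y = Theta @ w_true + 0.01 * rng.standard_normal(200)
w_hat = stridge(Theta, y)            # sparse estimate of w_true
```

In the turbulence setting, the columns of Theta would hold tailored features built from filtered flow quantities, and the surviving terms constitute the discovered closure model.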

Hybrid Analytics for Fluid Dynamics: Model Order Reduction

We are developing robust hybrid analytics approaches and algorithms that combine physics-based models with machine learning frameworks for both intrusive and non-intrusive reduced order modeling paradigms of nonlinear and nonstationary systems. By analyzing the similarities between LES and reduced order models (ROMs), we explore and exploit a series of physics-based, data-driven, and hybrid closure models to stabilize ROM emulators, enabling more accurate near real-time predictions, specifically designed for systems involving general circulation models.

Digital Twins

A digital twin is defined as a virtual representation of a physical asset, enabled through data and simulators, for real-time prediction, monitoring, control, and optimization of that asset, supporting improved decision making throughout the asset's life cycle and beyond. With the recent wave of digitalization, the latest trend in every industry is to build systems and approaches that help not only during the conceptualization, prototyping, testing, and design optimization phases but also during operation. In our lab, we exploit digital twin technologies to facilitate new computational models in systems relevant to fluid dynamics.

Wind Energy & Aviation Safety

Microscale terrain-induced turbulence significantly impacts wind energy and aviation-related activities. Through collaborations with the Norwegian University of Science and Technology (NTNU) and the Computational Science and Engineering group at SINTEF in Norway, we are exploring the feasibility of a digital twin approach in the context of wind power production and aviation safety. In terms of basic science, we aim to provide an in-depth fundamental understanding of terrain-induced turbulence and wake-vortex dynamics. Our main effort consists of the development of a wind power forecast system for wind farms and a turbulence alert system for airports.

Big Data Cybernetics

In the context of upcoming technologies like the digital twin, the role of cybernetics is to steer a system toward an optimal set-point. The difference between the measured output and the reference set-point, called the error signal, is applied as feedback to the controller, which generates a system input that brings the output closer to the reference. With the growing availability of sensors and communication technologies, an increasingly large volume of data (indeed, big data) is being made available in real time. This concept offers many new perspectives to our rapidly digitized society and its seamless interactions with many different fields.
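The feedback loop described above can be made concrete with a minimal sketch: a proportional-integral (PI) controller drives a simple first-order system toward a reference set-point using the error signal. The plant model and gain values are illustrative assumptions, not a model of any specific asset.

```python
# Minimal cybernetic feedback loop: a PI controller steers a first-order
# plant dy/dt = -a*y + b*u toward a reference set-point.
dt = 0.01                # time step for the explicit Euler integration
a, b = 1.0, 1.0          # plant parameters (assumed values)
Kp, Ki = 4.0, 2.0        # controller gains (assumed values)
ref = 1.0                # desired set-point

y, integ = 0.0, 0.0      # system output and integral of the error
for _ in range(2000):
    e = ref - y                  # error signal: reference minus measured output
    integ += e * dt              # integral action removes steady-state error
    u = Kp * e + Ki * integ      # controller output fed back into the plant
    y += dt * (-a * y + b * u)   # one Euler step of the plant dynamics
# After 20 time units, y has settled close to the reference.
```

In a big data cybernetics setting, the hand-tuned plant model would be replaced or augmented by data-driven models updated in real time from streaming sensor measurements.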

Interface Learning

While immense advances in computational mathematics and scientific computing have come to fruition, large-scale simulations still suffer from a curse of dimensionality that limits turnaround time. In our lab, we introduce a machine learning framework to learn physically accurate interface boundary conditions without needing to resolve the whole computational domain. In other words, we aim to minimize (indeed, prevent) communications from surrounding regions to the region of primary interest. We advocate the development of such interface learning methodologies to model the information exchange at the interface, an emerging topic that will have far-reaching impact on a large variety of problems in science and engineering.


We are grateful for the support from DOE, NSF, ASHRAE, NASA EPSCoR, and NVIDIA.