Center for Data-Driven Computational Physics
Faculty affiliated with the Center for Data-Driven Computational Physics are involved in a number of ongoing projects.
A Diagnostic Modeling Methodology for Dual Retrospective Cost Adaptive Control of Complex Systems (PI: Dennis Bernstein; co-PIs: M. Gamba, K. Duraisamy)
This project will extend data-driven adaptive control to systems that are beyond the capability of traditional adaptive control algorithms due to the extreme complexity of the physics. This will be accomplished by developing, demonstrating, and validating a novel diagnostic modeling methodology that uses limited sensor data to uncover the essential dynamics of the system. The project combines multidisciplinary expertise in adaptive control and system identification; computational fluid dynamics and data-driven modeling; and combustion dynamics and physics-guided diagnostics. The venue for developing, demonstrating, and validating the proposed diagnostic modeling methodology is experimental control of instability in lean premixed combustion.
Funded by NSF
Uncertainty Quantification of Microstructural Material Variability Effects (U-M PI: K. Garikipati)
This project, in collaboration with Sandia National Laboratories, will develop data-driven models of continuum plasticity. The big data in this study will come from a very large number of crystal plasticity computations, as well as experiments, that account for microstructural variability. A machine learning tier will connect these models with macroscale continuum plasticity code and will incorporate bounds on uncertainty. This algorithmic framework will be developed on ConFlux.
Funded by Sandia National Labs
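The machine-learning tier described above, a surrogate trained on ensembles of microstructure-resolved computations with uncertainty bounds attached, can be illustrated with a minimal numpy sketch. Everything here is synthetic and hypothetical (the inputs, the polynomial surrogate, and the bootstrap bounds are illustrative stand-ins, not the project's actual methods or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a database of crystal-plasticity results:
# x is a microstructural descriptor (e.g. a grain-size statistic) and
# y a homogenized response (e.g. yield stress), with variability.
x = rng.uniform(0.0, 1.0, 200)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.1, 200)

# Bootstrap an ensemble of simple polynomial surrogates so the
# macroscale model receives a prediction *and* an uncertainty band.
n_boot, degree = 50, 2
x_query = np.linspace(0.0, 1.0, 11)
preds = []
for _ in range(n_boot):
    idx = rng.integers(0, len(x), len(x))        # resample with replacement
    coeffs = np.polyfit(x[idx], y[idx], degree)  # refit the surrogate
    preds.append(np.polyval(coeffs, x_query))
preds = np.array(preds)

mean = preds.mean(axis=0)                           # surrogate prediction
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)  # 95% uncertainty band
```

In practice the polynomial would be replaced by a far richer learned model and the resampling would span the microstructural ensemble itself, but the interface is the same: the macroscale code queries a prediction together with bounds.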
Scalable Environment for Quantification of Uncertainty and Optimization in Industrial Applications (U-M PI: K. Duraisamy)
This project develops an integrated plan for performing uncertainty quantification (UQ) and design under uncertainty (DUU) that aggressively pursues new frontiers in scale and complexity. In particular, the project will advance scalable forward and inverse UQ algorithms and the rigorous quantification of model inadequacy using data-driven approaches. It provides a foundation for generalized stochastic design approaches that address the robustness and reliability of complex multidisciplinary systems. This is a collaborative effort with Sandia National Laboratories, Stanford University, and the Colorado School of Mines.
Funded by DARPA
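Forward UQ, the first ingredient named above, propagates uncertainty in model inputs through to output statistics. The following numpy sketch shows the idea with plain Monte Carlo sampling on a toy response; the response function, the input distributions, and the failure threshold are all illustrative assumptions, not from the project (which targets far more scalable algorithms than brute-force sampling):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system response; a real application would wrap an expensive simulation.
def response(k, f):
    return f / k  # static displacement of a linear spring, u = f/k

# Uncertain inputs with assumed distributions: stiffness and applied load.
k = rng.lognormal(mean=0.0, sigma=0.1, size=100_000)
f = rng.normal(loc=1.0, scale=0.05, size=100_000)

u = response(k, f)

mean_u = u.mean()          # forward-propagated mean
std_u = u.std()            # forward-propagated standard deviation
p_fail = np.mean(u > 1.3)  # probability of exceeding a design threshold
```

The same statistics feed DUU: a design loop would adjust deterministic parameters to drive `p_fail` below a target while penalizing `std_u`, which is why scalable UQ is a prerequisite for design under uncertainty.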
ConFlux: A Novel Platform for Data-Driven Computational Physics (PI: K. Duraisamy; co-PIs: K. Garikipati, B. Mozafari, A. Figueroa, G. Evrard)
This project develops a hardware/software ecosystem called ConFlux, specifically designed to enable High Performance Computing (HPC) clusters to communicate seamlessly, and at interactive speeds, with data-intensive operations, thereby enabling large-scale data-driven modeling of multiscale physical systems. ConFlux will produce advances in predictive modeling in several disciplines, including turbulent flows, materials physics, cosmology, climate science, and cardiovascular flow modeling. These applications require HPC codes (running on external clusters) to interact with large data sets at run time. ConFlux provides low-latency communication for in- and out-of-core data, cross-platform storage, high-throughput interconnects, and massive memory allocations. The file system and scheduler natively handle extreme-scale machine learning and traditional HPC modules in a tightly integrated workflow, rather than in segregated operations, leading to significantly lower latencies, fewer algorithmic barriers, and less data movement.
Funded by NSF
Formalisms and Tools for Data-driven Turbulence Modeling (PI: K. Duraisamy)
The goal of this research is to devise rigorous mathematical techniques that utilize experimental and simulation data to develop predictive models of turbulent flow. An important aspect of this approach is that the data is processed in the context in which it is needed for prediction. Domain-specific machine learning techniques are used to convert information into modeling knowledge. In essence, the inverse solution infers functional deficiencies in the model, and machine learning is used to reconstruct the missing functional form. Objectives include investigating how to identify and formulate a properly posed data-driven turbulence modeling problem, the implications these approaches have for more general data-driven computational physics applications, and the most effective ways to use machine learning in a predictive physics setting. Applications to be explored include transition to turbulence, thermal transport, and near-wall turbulent stress closures. The proposed work is expected to result in improved closure models for Reynolds-averaged as well as hybrid Reynolds-averaged/large eddy simulations.
Funded by NSF
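The two-step idea above, inferring a model deficiency from data and then reconstructing it with machine learning, can be sketched on a toy problem. Here the "baseline model" and the hidden multiplicative correction are invented for illustration (noiseless, so the inverse step is a direct division rather than an optimization); the actual project works with turbulence closures and far richer learning models:

```python
import numpy as np

# Toy baseline model q_model(x); the "truth" differs by an unknown
# multiplicative correction beta(x), mimicking a closure-model deficiency.
x = np.linspace(0.0, 1.0, 50)
q_model = np.sin(np.pi * x) + 1.0
beta_true = 1.0 + 0.3 * x      # hidden functional deficiency
q_data = beta_true * q_model   # "measurements" of the true system

# Inverse step: infer beta pointwise by matching the data.
# (Noiseless toy case, so the least-squares solution is a direct ratio;
# in practice this is a regularized inverse problem.)
beta_inferred = q_data / q_model

# Machine-learning step: reconstruct beta as a function of model features
# (here just x, via a linear fit) so the correction generalizes
# beyond the specific case it was inferred from.
coeffs = np.polyfit(x, beta_inferred, deg=1)
beta_ml = np.polyval(coeffs, x)

q_corrected = beta_ml * q_model  # baseline model augmented by learned term
```

The key point carried over from the text: the inferred quantity is the *functional deficiency* of the model, expressed in model variables, not the raw data itself, which is what lets the learned correction be embedded back into a predictive solver.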
This project develops a framework to utilize large-scale data for predictive modeling. It involves the development of domain-specific learning techniques suited to the representation of turbulence and its modeling, the establishment of a trusted ensemble of data for the creation and validation of new models, and the deployment of these models in complex aerospace problems. This is a collaborative effort between the University of Michigan, Stanford University, Iowa State University, and Pivotal Inc., consulting with Boeing Commercial Airplanes and interacting with NASA Langley Research Center.
Funded by the LEARN (Leading Edge Aeronautics Research for NASA) program, through the NASA Aeronautics Research Institute (NARI)
Integrated computational framework for designing dynamically controlled alloy-oxide heterostructures (PIs: E. Marquis & K. Garikipati)
This project will develop an openly distributable framework that rigorously integrates theory, experiment and computation to predict and elucidate the evolution of complex materials heterostructures. A central challenge is linking the electronic structure of the constituent chemistries of a complex materials system to its behavior at technologically relevant length and time scales.
Funded by NSF
Developing a Theory of Spatially Evolving Turbulence for Cardiovascular Flows (PIs: A. Figueroa, E. Johnsen, D. Dowling)
Turbulence is present in pathologic cardiovascular conditions such as aortic coarctations, aneurysms, and arterio-venous fistulas. Since direct numerical simulation of such flows is prohibitively expensive, there is a pressing need to develop turbulence models that take into account the complex spatially (and temporally) evolving nature of blood flow.
We have recently developed a mixed laminar-turbulent model that has the potential to be extended to spatially evolving turbulent structures such as those seen in cardiovascular flows. If successful, this model would eventually enable the computation of complex blood flows in a clinically relevant timeframe on current hardware.
Funded by UM
Mechanisms Underlying the Progression of Large Artery Stiffness in Hypertension (PI: A. Figueroa)
Central artery stiffening is a well-established initiator and indicator of cardiovascular disease; it arises in hypertension, aging, diabetes, obesity, connective tissue disorders such as Marfan syndrome, organ transplantation, and the treatment of AIDS patients. Such stiffening contributes to heart disease and end-stage kidney failure. This project will use four diverse mouse models of hypertension, together with computer models, to identify improved methods of diagnosing and treating central arterial stiffening.
Funded by NIH