diff --git a/Algebra-and-Number-Theory/Awards-Algebra-and-Number-Theory-2024.csv b/Algebra-and-Number-Theory/Awards-Algebra-and-Number-Theory-2024.csv
index dca92b8..8da2d1e 100644
--- a/Algebra-and-Number-Theory/Awards-Algebra-and-Number-Theory-2024.csv
+++ b/Algebra-and-Number-Theory/Awards-Algebra-and-Number-Theory-2024.csv
@@ -1,9 +1,12 @@
"AwardNumber","Title","NSFOrganization","Program(s)","StartDate","LastAmendmentDate","PrincipalInvestigator","State","Organization","AwardInstrument","ProgramManager","EndDate","AwardedAmountToDate","Co-PIName(s)","PIEmailAddress","OrganizationStreet","OrganizationCity","OrganizationState","OrganizationZip","OrganizationPhone","NSFDirectorate","ProgramElementCode(s)","ProgramReferenceCode(s)","ARRAAmount","Abstract"
"2401382","Building Blocks for W-algebras","DMS","ALGEBRA,NUMBER THEORY,AND COM","09/01/2024","08/07/2024","Andrew Linshaw","CO","University of Denver","Standard Grant","James Matthew Douglass","08/31/2027","$204,985.00","","andrew.linshaw@du.edu","2199 S UNIVERSITY BLVD RM 222","DENVER","CO","802104711","3038712000","MPS","126400","","$0.00","Vertex operator algebras (VOAs) arose in physics in the 1980s as the symmetry algebras of two-dimensional conformal field theories (CFTs) and were first defined mathematically by Borcherds. They have turned out to be fundamental objects with connections to many subjects including finite groups, Lie theory, combinatorics, integer partitions, modular forms, and algebraic geometry. W-algebras are an important class of VOAs that are associated to a Lie (super)algebra g and a nilpotent element f in the even part of g. They appear in various settings including integrable systems, CFT to higher spin gravity duality, the Allday-Gaiotto-Tachikawa correspondence, and the quantum geometric Langlands program. In this project, the PI will investigate the structure and representation theory of W-algebras. This will advance the subject and provide research training and collaborative opportunities for graduate students and postdocs.
In more detail, principal W-algebras (the case where f is a principal nilpotent) are the best understood class of W-algebras. They satisfy Feigin-Frenkel duality, and in classical Lie types they also admit a coset realization which has numerous applications to representation theory. It turns out that both Feigin-Frenkel duality and the coset realization are special cases of a more general duality which was conjectured by Gaiotto and Rapcak and proven recently by the PI and Creutzig. It says that the affine cosets of certain triples of W-algebras are isomorphic as 1-parameter VOAs. These cosets are known as Y-algebras in type A, and orthosymplectic Y-algebras in types B, C, and D. The Y-algebras can all be obtained as 1-parameter quotients of a universal 2-parameter VOA, and they are conjectured to be the building blocks for all W-algebras in type A. The orthosymplectic Y-algebras are quotients of another universal 2-parameter VOA, but they do not provide all the necessary building blocks for W-algebras in types B, C, and D. The main goals of this project are (1) to identify the missing building blocks, which we expect to arise as quotients of a third universal 2-parameter VOA; (2) to prove that W-algebras of all classical types can be obtained as conformal extensions of tensor products of building blocks. Special cases will involve W-algebras with N=1 and N=2 supersymmetry, and the PI hopes to prove some old conjectures from physics on coset realizations of these structures. Finally, the Y-algebras and other building blocks admit many levels where their simple quotients are lisse and rational. Exhibiting W-algebras at special levels as extensions of building blocks will lead to many new rationality results.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2401498","Birational Geometry, Hodge Theory and Singularities","DMS","ALGEBRA,NUMBER THEORY,AND COM","09/01/2024","08/08/2024","Mihnea Popa","MA","Harvard University","Continuing Grant","James Matthew Douglass","08/31/2027","$222,345.00","","mpopa@math.harvard.edu","1033 MASSACHUSETTS AVE STE 3","CAMBRIDGE","MA","021385366","6174955501","MPS","126400","","$0.00","This project addresses problems of fundamental interest in pure mathematics, and especially in the field of algebraic geometry. This area of study is one of the oldest in mathematics, but also one of the most active currently. The past decade or so has seen some of the most outstanding modern developments and connections with areas of pure and applied science. The purpose of this project is to continue one such modern development, namely the application of methods based on the theory of so-called mixed Hodge modules to questions in higher dimensional complex geometry. These objects are the outcome of an intricate mix of algebra, analysis, and topology, and can be used to prove new results about geometric shapes and singularities. In particular, the PI develop new results about basic invariants of geometric objects (some of the key words here are the Kodaira dimension, or the local cohomological dimension). In the broader sense, the PI is involved with the mathematical community through his work on editorial boards, scientific boards and AMS committees, and through his lectures and expository notes prepared for US and international events. This project will provide research training opportunities for students.
In more detail, the PI will continue studying questions in complex birational geometry and singularity theory, especially through the use of Hodge theory and D-modules. He will make further progress on the study of the Hodge filtration on the local cohomology of arbitrary subschemes of smooth varieties, and on the closely related theory of higher Du Bois and higher rational singularities. The general context requires significant new ideas compared to the case of local complete intersections, which is by now rather well understood. This program will lead to new applications, similar to what the theory of multiplier ideals, and more generally Hodge ideals, produced in the case of hypersurfaces. In a different direction, the PI has recently proposed a conjecture on the superadditivity of the Kodaira dimension for morphisms between smooth complex projective varieties, and more generally an additivity conjecture for smooth projective morphisms between quasi-projective varieties; they complement Iitaka's well-known subadditivity conjecture, a problem of central importance in birational geometry. Some results have already been obtained, and the PI will make further progress on these conjectures, for instance by showing that additivity holds when the general fiber of the morphism admits a good minimal model. This is closely related to other interesting projects, for instance studying a natural generalization of Viehweg's hyperbolicity conjecture in the same setting. The PI proposes to attack further problems in algebraic dynamics, and in the analytic study of the V-filtration.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2401174","Homological and enumerative intrinsic mirror symmetry and related topics","DMS","ALGEBRA,NUMBER THEORY,AND COM","09/01/2024","08/08/2024","Bernd Siebert","TX","University of Texas at Austin","Continuing Grant","James Matthew Douglass","08/31/2027","$181,612.00","","siebert@math.utexas.edu","110 INNER CAMPUS DR","AUSTIN","TX","787121139","5124716424","MPS","126400","","$0.00","Mirror symmetry is a deep and multi-faceted phenomenon observed by theoretical particle physicists in the context of string theory. It involves connections between different mathematical structures, such as computing volumes versus counting objects. Investigations of the phenomenon have produced a long list of fundamentally new techniques and uncovered unexpected links between hitherto largely unrelated branches of mathematics and physics. Research under this award aims to prove a part of mirror symmetry in the original setup of Calabi-Yau varieties, a distinguished class of spaces in both algebraic and differential geometry. In this setting, pairs of mirror partners have been constructed in rather complete generality, and the remaining task is to prove the two main claimed relations between the geometries: the mentioned enumerative prediction and homological mirror symmetry. This project will provide research training opportunities for students.
Concerning homological mirror symmetry, the goal is to relate the intrinsic mirror ring for a degeneration of Calabi-Yau varieties constructed by the PI and Mark Gross to the symplectic monodromy ring for the degeneration, defined by Floer theory. Another related topic is the generalization of the intrinsic mirror ring to a relative quantum cohomology ring for a pair consisting of a smooth projective variety and a normal crossings divisor. The task for enumerative mirror symmetry is to compute enumerative period integrals from the wall structure appearing in the mirror construction.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2346684","Classical Representation Theory and Categorification","DMS","ALGEBRA,NUMBER THEORY,AND COM","08/15/2024","08/08/2024","Alexander Kleshchev","OR","University of Oregon Eugene","Continuing Grant","James Matthew Douglass","07/31/2027","$68,085.00","","klesh@math.uoregon.edu","1776 E 13TH AVE","EUGENE","OR","974031905","5413465131","MPS","126400","","$0.00","Groups are mathematical objects arising in the study of symmetry. This project is concerned with some of the most important, fundamental and universal examples of groups: symmetric groups arising as symmetries of finite sets, and general linear groups arising as symmetries of vector spaces. Representation theory studies groups via their actions on other mathematical objects, such as vector spaces. Rather informally, representations of a group are snap-shots of the group taken from different angles. In the past several years, the idea of categorification has become important and has led to the development of higher representation theory. This involves studying actions of groups on higher mathematical structures such as categories, analyzing not only the relations between these structures (functors) but also relations between the relations (natural transformations). In particular, quiver Hecke superalgebras encode higher symmetries underlying representation theory of objects including symmetric groups, their double covers, and general linear groups. This project will develop further the theory of these and other superalgebras and apply it to improve our understanding of classical representation theory. The research in this project has potential future impact in theoretical physics and computer science. More directly this project will have an educational impact through the training of graduate students and the mentoring of young researchers in this active area.
In more detail, this project is concerned with a variety of projects in representation theory of Lie algebras, finite groups, and related objects, for example Hecke algebras, quantum groups, Schur algebras and quiver Hecke superalgebras. The PI will draw on recent advances in higher representation theory, with various diagrammatically defined monoidal (super)categories playing a prominent role. On the other hand, most applications are to classical problems in representation theory such as block theory of finite groups and Schur algebras, decomposition numbers, and structure theory of finite groups. The PI will study the local description of blocks of double covers of symmetric groups up to derived equivalence, Turner-Schur (super)algebras and (super)categories and their properties, representations of quiver Hecke superalgebras and their imaginary cuspidal superalgebras, cyclotomic quiver Hecke superalgebras and their RoCK blocks, RoCK blocks of Schur superalgebras, decomposition numbers for RoCK blocks of double covers of symmetric and alternating groups and irreducible reductions modulo p for these double covers, irreducible restrictions from quasi-simple groups to subgroups and subgroup structure of finite groups. The results of the research will have applications to several areas of mathematics including finite group theory (and its applications), Lie theory, combinatorics, representation theory, knot theory and category theory.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2401420","Factorization and degeneration of chiral homology","DMS","ALGEBRA,NUMBER THEORY,AND COM","08/15/2024","08/07/2024","Chiara Damiolini","TX","University of Texas at Austin","Standard Grant","James Matthew Douglass","07/31/2027","$199,927.00","","chiara.damiolini@austin.utexas.edu","110 INNER CAMPUS DR","AUSTIN","TX","787121139","5124716424","MPS","126400","","$0.00","A classical problem, with many applications in the sciences, engineering and the arts, is to determine the symmetries of an object. The branch of mathematics that studies such questions is called representation theory. In this field, symmetries are studied in part by packaging them together as abstract structures with appropriate algebraic properties, such as groups or Lie algebras. Vertex Operator Algebras (VOAs) are a generalization of Lie algebras. VOAs are tightly connected to theoretical physics in what is known as Conformal Field Theory, and, also, to the geometry of surfaces, like spheres or donut-like objects. An important way to study VOAs and their relationship to geometry is via what is known as Chiral Homology. This can be seen as a recipe that takes a VOA and a surface as ingredients and produces a collection of spaces that encode information about the symmetries of the VOA and the complexity of the surface they depend on. However, a variety of fundamental questions about the spaces produced through this recipe are still unresolved. In this project the PI will answer some of these questions. In particular, the PI will describe how Chiral Homology behaves when the surface it depends on is appropriately deformed, and provide a geometric realization of Chiral Homology. The project will also provide research training opportunities for students.
In more technical terms, spaces of conformal blocks associated with projective curves--the algebraic analogue of surfaces--and Lie algebras have been central objects of study in algebraic geometry. In fact, these spaces can be identified with generalized theta functions on the moduli space of principal bundles, and they also define vector bundles on moduli spaces of stable curves. One can consider natural generalizations of these spaces: replacing Lie algebras with VOAs; considering the derived notion of conformal blocks, called Chiral Homology; and allowing the projective curve to admit worse than nodal singularities. The PI and her coauthors have shown that conformal blocks from regular VOAs satisfy factorization and sewing. These properties explicitly control the behavior of conformal blocks under nodal degeneration of the curve they depend on and have been the main tools to compute the dimensions of these spaces through the Verlinde formula. In this project, the PI will show that Chiral Homology from regular VOAs satisfies factorization and sewing. Furthermore, the PI will provide a geometric realization of Chiral Homology and extend this notion to curves with worse singularities.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2401547","Models of Curves, Rational Points, and Modified Diagonal Cycles","DMS","ALGEBRA,NUMBER THEORY,AND COM","09/01/2024","08/07/2024","Padmavathi Srinivasan","MA","Trustees of Boston University","Standard Grant","Adriana Salerno","08/31/2027","$200,000.00","","padmask@bu.edu","1 SILBER WAY","BOSTON","MA","022151703","6173534365","MPS","126400","","$0.00","Number theory has a rich history of long-standing questions that are surprisingly easy to state but notoriously difficult to answer -- for example which sums of perfect powers equal another perfect power (a generalization of the famous Fermat's Last Theorem that remains unanswered). The central paradigm in arithmetic geometry is that the geometry of polynomial equations has a strong bearing on the geography of whole number solutions. In the last 20 years, the quadratic Chabauty method has emerged as a powerful new technique for locating whole number solutions (i.e. rational points) on curves that were impervious to all previous methods. In practice, one makes several simplifying assumptions on the curves in question to use this method. The PI will continue work with collaborators in the area of arithmetic geometry: exploring a new theoretical framework for the quadratic Chabauty method; explicitly computing invariants measuring the complexity of reduction types of curves; and introducing new computational tools to study the Ceresa cycle, a fundamental invariant associated to an algebraic curve with close ties to its geometry and arithmetic. Additionally, the PI will organize events intended to support and showcase the work of junior mathematicians at the institutional, regional, and national levels.
The proposed research will explore three aspects of the arithmetic and geometry of curves. One goal is to explicitly describe good models for solvable covers of the projective line over p-adic fields, and use them to extract various arithmetic invariants of these curves, building on past work by the PI for cyclic covers. Another goal is to build new algorithms for computing various constants appearing in the quadratic Chabauty method, using a new framework at bad primes jointly developed with her collaborators, utilizing recent advances in the comparison of p-adic integration theories for curves with bad reduction. The third goal is to use techniques from p-adic integration and the geometry of curves in characteristic p to provide new methods for establishing the nontriviality of the Ceresa cycle, a canonical one-dimensional algebraic cycle associated to a curve.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2401587","Polylogarithms, cluster algebras, and hyperbolic geometry","DMS","ALGEBRA,NUMBER THEORY,AND COM, TOPOLOGY","08/15/2024","08/02/2024","Christian Zickert","MD","University of Maryland, College Park","Standard Grant","Tim Hodges","07/31/2027","$210,000.00","","zickert@umd.edu","3112 LEE BUILDING","COLLEGE PARK","MD","207425100","3014056269","MPS","126400, 126700","","$0.00","This award supports research on the interplay between three different research areas: polylogarithms, cluster algebras, and hyperbolic geometry. Polylogarithms generalize the natural logarithm and have been studied since the 18th century. Cluster algebras, invented in the early 21st century, are purely combinatorial objects which are widely studied and broadly applicable. Hyperbolic geometry is a geometry with constant negative curvature, where Euclid's fifth postulate fails. Recent advances have revealed surprising links between these areas. For example, formulas for scattering amplitudes in high energy physics frequently involve polylogarithms evaluated at cluster algebra coordinates. Also, the volume of a certain hyperbolic polyhedron known as an orthoscheme, where successive faces form right angles, is given by a polylogarithm formula. The proposal will investigate key conjectures, find new examples of hyperbolic manifolds, and compute invariants using cluster coordinates. The PI will involve both graduate and undergraduate students in this project and continue his outreach to local schools.
The proposal will explore the relationship between polylogarithms and cluster algebras focusing on several key conjectures in the field. These include the Matveiakin-Rudenko conjecture, that all polylogarithm relations arise from the cluster polylogarithm relations of type A_n; Zagier's polylogarithm conjecture, that the zeta function of a number field at integers is expressed by polylogarithms; and Goncharov's depth conjecture, that a polylogarithm is a classical polylogarithm if an only if its truncated coproduct vanishes. The proposal will explore special cases of these conjectures using Matveiakin and Rudenko's notion of cluster polylogarithms as well as new tools developed by the PI and his collaborators. In addition, the proposal will study Rudenko's polylogarithm formula for a hyperbolic orthoscheme, find new examples of hyperbolic manifolds that don't arise from Coxeter groups (and therefore have dihedral angles that are not a submultiple of pi), and generalize formulas for Cheeger-Chern-Simons invariants from dimension 3 to dimension 5.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2348849","Multiplicities and Valuations in Commutative Algebra","DMS","ALGEBRA,NUMBER THEORY,AND COM","08/15/2024","08/07/2024","Steven Cutkosky","MO","University of Missouri-Columbia","Standard Grant","Tim Hodges","07/31/2027","$209,771.00","","cutkoskys@missouri.edu","121 UNIVERSITY HALL","COLUMBIA","MO","652113020","5738827560","MPS","126400","","$0.00","This award supports a research project on the interaction of algebra and geometry, and the application of commutative algebra to other areas of mathematics. A motivating problem is the existence of resolution of singularities. To resolve singularities is to smooth out, by algebraic operations, the singularities in a space defined by polynomial or analytic equations. Resolution of singularities is still open in certain cases for spaces of dimension greater than 3. It is of importance in other branches of mathematics, physics and engineering. An important focus of the project will be the training of graduate students and the mentoring of young mathematicians from diverse backgrounds.
A major topic to be investigated in this project is properties and applications of filtrations in local rings. Another topic to be investigated is inseparable local uniformization, the local resolution of singularities along a valuation after taking a purely inseparable extension. The PI will also investigate the characterization of good properties of extensions of valuations. The project will explore properties of filtrations of rings, including their analytic spread, Hilbert functions and generalized multiplicities. A particular emphasis will be on divisorial filtrations, which, although non-Noetherian, share many good properties of the Noetherian I-adic filtrations of an ideal I. The PI will also study extensions of valuation rings with the goal of giving valuation theoretic characterizations of when the extensions are almost finite etale, or have related good properties.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2401587","Polylogarithms, cluster algebras, and hyperbolic geometry","DMS","ALGEBRA,NUMBER THEORY,AND COM, TOPOLOGY","08/15/2024","08/02/2024","Christian Zickert","MD","University of Maryland, College Park","Standard Grant","Tim Hodges","07/31/2027","$210,000.00","","zickert@umd.edu","3112 LEE BUILDING","COLLEGE PARK","MD","207425100","3014056269","MPS","126400, 126700","","$0.00","This award supports research on the interplay between three different research areas: polylogarithms, cluster algebras, and hyperbolic geometry. Polylogarithms generalize the natural logarithm and have been studied since the 18th century. Cluster algebras, invented in the early 21st century, are purely combinatorial objects which are widely studied and broadly applicable. Hyperbolic geometry is a geometry with constant negative curvature, where Euclid's fifth postulate fails. Recent advances have revealed surprising links between these areas. For example, formulas for scattering amplitudes in high energy physics frequently involve polylogarithms evaluated at cluster algebra coordinates. Also, the volume of a certain hyperbolic polyhedron known as an orthoscheme, where successive faces form right angles, is given by a polylogarithm formula. The proposal will investigate key conjectures, find new examples of hyperbolic manifolds, and compute invariants using cluster coordinates. The PI will involve both graduate and undergraduate students in this project and continue his outreach to local schools.
The proposal will explore the relationship between polylogarithms and cluster algebras focusing on several key conjectures in the field. These include the Matveiakin-Rudenko conjecture, that all polylogarithm relations arise from the cluster polylogarithm relations of type A_n; Zagier's polylogarithm conjecture, that the zeta function of a number field at integers is expressed by polylogarithms; and Goncharov's depth conjecture, that a polylogarithm is a classical polylogarithm if and only if its truncated coproduct vanishes. The proposal will explore special cases of these conjectures using Matveiakin and Rudenko's notion of cluster polylogarithms as well as new tools developed by the PI and his collaborators. In addition, the proposal will study Rudenko's polylogarithm formula for a hyperbolic orthoscheme, find new examples of hyperbolic manifolds that don't arise from Coxeter groups (and therefore have dihedral angles that are not a submultiple of pi), and generalize formulas for Cheeger-Chern-Simons invariants from dimension 3 to dimension 5.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2348840","Categorification of quasi-split i-quantum groups and related topics in representation theory","DMS","ALGEBRA,NUMBER THEORY,AND COM","08/15/2024","08/02/2024","Jonathan Brundan","OR","University of Oregon Eugene","Standard Grant","James Matthew Douglass","07/31/2027","$210,000.00","","brundan@uoregon.edu","1776 E 13TH AVE","EUGENE","OR","974031905","5413465131","MPS","126400","","$0.00","This is a project in representation theory which, roughly speaking, is the idea of understanding symmetry in the broadest sense by studying the different ways in which symmetries can be realized in terms of matrices. There are many applications, including to number theory, combinatorics, low-dimensional topology, theoretical physics and chemistry. Nearly forty years ago, quantum groups were discovered and shown to possess some remarkably rigid canonical bases. This had a dramatic impact on the study of the classical Lie groups which are the most fundamental symmetries in nature. In fact, quantum groups and their canonical bases are shadows of some even more remarkable higher structures, Kac-Moody 2-categories, which are often referred to as the categorifications of quantum groups. In classical mathematics, Lie groups go hand in hand with the symmetric spaces on which they act. Symmetric spaces admit quantizations, namely the i-quantum groups appearing in the title of the project, which were first introduced in 1998 and rapidly developed into a rich theory. This project will also provide research training activities for graduate students.
The main goal of this project is to take the next step by categorifying all quasi-split i-quantum groups, building on the recent discovery by the PI and collaborators of a 2-category which categorifies the simplest rank one i-quantum group. This theory, although algebraic in nature, has many connections to the geometry of the underlying Lie groups via the theory of singular Soergel bimodules. In addition, the PI will study more classical topics in representation theory by applying the deeper understanding of the quantum and i-quantum groups that appear as hidden symmetries in more classical settings.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2401311","Zero-free regions for L-functions and related problems","DMS","ALGEBRA,NUMBER THEORY,AND COM","08/01/2024","07/31/2024","Jesse Thorner","IL","University of Illinois at Urbana-Champaign","Standard Grant","Andrew Pollington","07/31/2027","$194,933.00","","jesse.thorner@gmail.com","506 S WRIGHT ST","URBANA","IL","618013620","2173332187","MPS","126400","","$0.00","This award is for research in the theory of numbers. Every positive whole number is uniquely expressible as a product of primes. Primes are fascinating to study theoretically, but they also feature prominently in cryptography (the secure transmission of information). The distribution of primes is analytically encoded in the Riemann zeta function, the simplest example of an L-function. L-functions are ubiquitous in modern number theory. Many widely studied number-theoretic problems are naturally phrased in terms of properties of more general L-functions. This project will focus on non-vanishing of L-functions, individually and in parametric families. This is one of the most important questions regarding L-functions. For example, the distribution of zeros of the Riemann zeta function influences the distribution of primes (the subject of the Riemann Hypothesis), and conjecturally, the Hasse-Weil L-function of an elliptic curve vanishes at the point s = 1/2 if and only if the elliptic curve has infinitely many rational points (the Birch and Swinnerton-Dyer conjecture). The project includes training of undergraduate and graduate students.
This project has three components. Towards the first component, the PI aims to develop new techniques to establish strong t-aspect zero-free regions for all Rankin-Selberg L-functions. The goal is a t-aspect zero-free region as strong as what de la Vallée Poussin established for the Riemann zeta function. Towards the second component, the PI aims to find new large classes of Rankin-Selberg L-functions for which one can establish a ""hybrid-aspect"" zero-free region with good t-dependence and no Landau-Siegel zero. This new uniformity will improve our understanding of the distribution of primes in relation to joint Sato-Tate laws involving two non-CM twist-inequivalent modular elliptic curves over a totally real number field. Towards the third component, the PI will continue earlier work on zero density estimates, showing that all L-functions in a family apart from a small exceptional set have very strong zero-free regions.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2435243","Analytic Number Theory over Function Fields","DMS","ALGEBRA,NUMBER THEORY,AND COM","02/15/2024","06/12/2024","Will Sawin","NJ","Princeton University","Continuing Grant","Andrew Pollington","06/30/2024","$24,864.00","","sawin@math.columbia.edu","1 NASSAU HALL","PRINCETON","NJ","085442001","6092583090","MPS","126400","","$0.00","Number theory is an area of mathematics that centers on the ordinary counting numbers and their behavior when we add and multiply them. While problems in this area are often simple to state, they can be fiendishly difficult to solve. The subfield of function field number theory aims to obtain insight on these problems by considering a kind of model or parallel universe where numbers behave differently. We consider what happens when we add or multiply numbers as normal but, except, instead of carrying digits, we simply drop the excess. Certainly arithmetic is a little easier with this modified rule, but more surprisingly, some of the most important problems in number theory become easier as well, with even some of the most difficult ones becoming solvable. (Technically, we should work in binary, or any prime base, rather than our usual base 10, for this.) Alternately, we can describe this variant arithmetic as the addition or multiplication of polynomial functions in a single variable. In this setting, we can connect number-theoretic questions to geometry, by viewing the graph of the polynomial as a geometric object. In this award the PI's research uses geometric tools to solve new problems in this area.
The PI's research has resolved function field analogues of classical problems in number theory, including the twin primes conjecture and Chowla's conjecture (both joint with Shusterman), cases of the Ramanujan conjecture (joint with Templier), and conjectures about moments of L-functions. In this award the PI will continue along these lines, proving additional results about the distribution of prime numbers, L-function moments, and automorphic forms, and working in further directions such as non-abelian Cohen-Lenstra heuristics. These works are all based on etale cohomology theory, where the foundational result, Deligne's Riemann Hypothesis, allows many different analytic problems (problems about proving some inequality) to be reduced to cohomology problems (problems about calculating some of the cohomology groups of a variety or sheaf). The relevant varieties are high-dimensional, and calculating the necessary cohomology groups requires techniques like vanishing cycles theory and the characteristic cycle.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
diff --git a/Applied-Mathematics/Awards-Applied-Mathematics-2024.csv b/Applied-Mathematics/Awards-Applied-Mathematics-2024.csv
index 6e11f1b..373f2e0 100644
--- a/Applied-Mathematics/Awards-Applied-Mathematics-2024.csv
+++ b/Applied-Mathematics/Awards-Applied-Mathematics-2024.csv
@@ -4,10 +4,12 @@
"2407006","Modeling and Analysis of Particle Laden Flow","DMS","APPLIED MATHEMATICS","09/01/2024","08/07/2024","Andrea Bertozzi","CA","University of California-Los Angeles","Continuing Grant","Pedro Embid","08/31/2029","$313,202.00","","bertozzi@math.ucla.edu","10889 WILSHIRE BLVD STE 700","LOS ANGELES","CA","900244200","3107940102","MPS","126600","","$0.00","This project involves modeling and analysis of particle laden flows using nonlinear partial differential equations (PDEs) for the flow thickness and volume fraction of particles in the flow. Such flows arise in many applications including the food industry, mining, and environmental cleanup. These flows are notoriously difficult to model because the dominant physics, especially for viscous flows, is due to many body interactions of the particles. There are no ""first principles"" continuum models for the physics and instead modelers rely on semi-empirical rules for particle settling and migration. Even at the elementary level of reduced order continuum theory, the mathematical equations are a system of conservation laws with fluxes that need to be estimated numerically. This project addresses fundamental mathematics problems related to these models. The project also develops new models for flows in complex geometries such as spiral separators used in the mining industry. This project is a five-year study that impacts our understanding of particle laden flow dynamics and analysis of PDEs for the novel fluid equations that model the physics of particle laden flows. In addition, this project provides research training for two doctoral students, five undergraduate researchers, and two postdoctoral scholars over a five-year period.
This project addresses several interrelated problems in particle laden flow models. (a) Flux functions in conservation law models for particle laden flow must be computed or estimated numerically. This raises the question of structural stability of multi-wave solutions of conservation laws under perturbation of the flux function. (b) Singular shocks have been shown to exist in conservation laws that model particle laden flow. Such solutions have largely, to date, been a curiosity in the mathematics literature. This project considers the actual physics that leads to singular shocks and studies how to continue those solutions after the singular shock formation in a way that is consistent with experimental observations. (c) This project considers models for bidisperse flows with direct comparison to experiments, building on earlier work for bidensity flows. (d) Spiral separators are devices used in the mining industry in which slurries flow under gravity in a helical trough and species within the slurry naturally separate through turns of the spiral, coming out as stripes at the end. This project develops an asymptotic model for two species flows in spiral separators and studies how to optimally separate the species.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2406240","Applications of infinite dimensional PDEs to transport, finance, and games","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Ibrahim Ekren","MI","Regents of the University of Michigan - Ann Arbor","Standard Grant","Pedro Embid","07/31/2027","$268,602.00","","iekren@umich.edu","1109 GEDDES AVE, SUITE 3300","ANN ARBOR","MI","481091079","7347636438","MPS","126600","","$0.00","This interdisciplinary research project aims to develop new mathematical tools based on partial differential equations (PDEs) and apply them to address practical challenges in mathematical finance and game theory. PDEs serve as fundamental mathematical models for understanding complex phenomena in these fields, ranging from optimization strategies in uncertain markets to analyzing strategic interactions in competitive environments. The unique aspect of this research lies in its focus on PDEs formulated in infinite-dimensional spaces, which present both theoretical challenges and opportunities for developing innovative solutions tailored to real-world financial and strategic decision-making scenarios. This award will also provide opportunities for students to be involved in the research projects.
In the first part of the project, infinite-dimensionality arises because one of the state variables is in the Wasserstein space of probability measures. Some of the equations to be studied are motivated by the large population limit of the equilibrium among N weakly interacting symmetric agents. The primary goal is to use these partial differential equations to demonstrate sharp convergence rates to the large population limit. A second class of equations in the Wasserstein space is inspired by causal optimal transport problems and the control of the solution of Kushner's equation for optimal filtering. While these transport problems were initially introduced to measure distances between the laws of stochastic processes, the project aims to establish a novel connection between these transport problems and issues of information asymmetry in finance. In the second part of the proposal, infinite-dimensionality arises from path dependence. The objective is to establish a version of Ishii's lemma applicable in this context and utilize this result to achieve well-posedness for two classes of second-order parabolic partial differential equations in the space of paths. The proposed research is expected to yield a comparison result for semicontinuous viscosity solutions of these PDEs, which directly impacts the convergence of numerical schemes for the hedging problem with rough volatility. In terms of applications, the infinite-dimensionality in this context stems from stochastic Volterra integral equations, time-inconsistent optimal control problems, and functional Itô calculus.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2405535","Nonlinear Waves in Discrete Heterogeneous Media","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Timothy Faver","GA","Kennesaw State University Research and Service Foundation","Standard Grant","Stacey Levine","07/31/2027","$134,001.00","","tfaver1@kennesaw.edu","1000 CHASTAIN RD NW","KENNESAW","GA","301445588","4705786381","MPS","126600","","$0.00","Many naturally occurring and artificially induced phenomena can be studied via models of discrete, heterogeneous media. Examples include the oscillations of DNA strands and muscle proteins, the resonances within granular crystals, and the flow of growth hormones among the cells of plant leaves. These examples are discrete in the sense that they can be separated into multiple components that remain distinct from each other but also interact together. They are heterogeneous because the material properties of their components are not all identical but instead vary in some identifiable manner, perhaps via intentional structural choices or due to flaws and defects. Both the discrete structure of a medium and the heterogeneous selection of its material data profoundly influence the kinds of dynamics that it can, or cannot, experience. This project will develop and refine the mathematical theory of waves in such media by studying increasingly more complicated dimensions of heterogeneity, by developing more precise and versatile quantifications of existing wave structures, and by identifying new kinds of dynamics that can propagate in the presence of physical discretization and material heterogeneity. The project will also provide topics coursework in nonlinear waves for advanced undergraduates and foster community engagement in department seminars and colloquia.
The project will study the effects of material heterogeneity in lattice differential equations. These arise after a traveling wave ansatz from the equations of motion for infinite chains of nonlinearly coupled discrete oscillators. Varying the oscillator masses and/or the coupling potentials yields material heterogeneities. Particularly meaningful structures result from allowing the material data to repeat periodically throughout the lattice. Recent work has shown that the classical solitary wave in monatomic lattices of identical particles and springs perturbs into a nanopteron or generalized solitary wave in dimer lattices of alternating particles or springs. The nanopteron traveling wave profile consists of small amplitude, highly oscillatory far field periodic ripples superimposed on an exponentially localized core. The project will adapt the construction of nanopterons to lattices whose material data repeats with a larger period than the dimer and whose structure destroys the useful physical symmetries of dimers, two significant and unresolved complications. One likely change is that the periodic ripple will need to be replaced by a quasiperiodic structure. Additionally, the project will revisit dimer nanopterons and extract more precise estimates on the amplitudes and phase shifts of the periodic ripples, as well as develop simpler model equations that transparently capture the functional-analytic complexities of these problems. Throughout, the main technical challenges will be the nonlocal structure of the traveling wave problems, a consequence of the lattice's discrete structure; singular perturbations arising from various physical limiting regimes of interest; multiple solvability conditions induced by the higher-order heterogeneities; and exponentially small, or possibly large, factors that arise in capturing the precise ripple amplitudes.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2408408","Collaborative Research: NSF-UEFISCDI: RUI: Effects of the Interplay Between Connectivity Architecture and Distributed Delays in Brain Network Dynamics","DMS","APPLIED MATHEMATICS","10/01/2024","08/08/2024","Thomas Covey","NY","SUNY at Buffalo","Standard Grant","Stacey Levine","09/30/2027","$32,736.00","","tjcovey@buffalo.edu","520 LEE ENTRANCE STE 211","AMHERST","NY","142282577","7166452634","MPS","126600","","$0.00","Over the past two decades researchers have made significant progress in developing mathematical models and tools that are compatible with understanding the complexity of the human brain and similarly complex systems. These tools have been used to investigate how complex neural interactions underlie dynamic patterns found in processes like learning, memory formation and cognition. However, many questions on both healthy and pathological brain function remain intractable by existing mathematical approaches. That is because the system's components interact both spatially and temporally. Hence, in order to model and understand differences between healthy and pathological function in a neural circuit, one needs to simultaneously keep track of connectivity architecture in a massive network, and of its past activity. This poses a significant challenge for both analytical and computational approaches. This project aims to establish and use a tractable quantitative framework that considers both of these aspects, by employing networks of coupled equations that include time delays to capture how recent interactions between the elements of the system influence future interactions. A traditional model of neural population dynamics will be used as the building block for larger functional brain circuits, while additionally incorporating different types of time-delays. This will enable a well-studied framework to be embedded into a new mathematical environment that jointly considers the system's architecture and history. Preliminary joint work (on toy network models with selected types of time delays) has established in principle that this approach is computationally tractable, and that it can be used to contextualize transitions between healthy brain function and pathological patterns (such as those found in Parkinson's disease and emotional disorders). The project will support a vertically integrated team including a postdoctoral fellow, a co-advised Ph.D. student and five undergraduate students at a predominantly undergraduate institution, recruited from underrepresented groups. The collaboration capitalizes on the team's combined expertise in network science, delay differential equations and brain imaging techniques.
The team will combine new network techniques with novel approaches to distributed time delays. The theoretical methods will be integrated with human brain data for potential clinical applications, via the collaboration with the Neurology Department at University at Buffalo. The approach will encompass three aspects that will develop simultaneously and support each other. (1) General results will be developed for networks under minimal assumptions on the network architecture and the shape of the delay kernels. (2) Numerical simulations will be used to demonstrate dynamic behaviors in specific classes of complex networks and for structured or stochastic distributions of delay kernels across the network nodes. (3) The new mathematical framework and numerical algorithms will be used to investigate how timing impacts information propagation in neural circuits that govern specific behaviors, in both computational models and in empirical data. The methods developed in this project will thus help us understand physiological mechanisms behind imaging and behavioral observations and help identify the underpinnings of pathological behaviors in neurological and psychiatric illnesses.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2406905","Towards Geometry-Informed Machine Learning: A Comprehensive Framework for Recognizing and Leveraging Heterogeneous Geometric Structure in Data","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Melanie Weber","MA","Harvard University","Continuing Grant","Stacey Levine","07/31/2027","$120,000.00","","mweber@seas.harvard.edu","1033 MASSACHUSETTS AVE STE 3","CAMBRIDGE","MA","021385366","6174955501","MPS","126600","075Z, 079Z","$0.00","This project supports the development of more efficient and sustainable machine learning methods using inherent structure in the data. Structured data arises in many scientific and industrial applications, including relational structure in complex social and biological systems, hierarchical structure in information and language systems, as well as symmetries in scientific data that derive from fundamental laws of physics. The project aims to develop methods for identifying, characterizing, and leveraging such structure in machine learning and data science applications. Research findings will be incorporated into graduate courses and graduate and undergraduate students from potentially diverse backgrounds will be mentored as part of this project, contributing to the training of the next generation of applied mathematicians. In addition, ideas and concepts with direct relation to the proposed research will be incorporated into STEM outreach activities with the goal of sharing the research findings with the broader community.
The project aims to develop a novel computational framework for leveraging geometric structure in data that is applicable to settings without pre-existing knowledge on data geometry. Geometric representation learning will be formalized as a model selection problem, where the respective geometric characteristics are learned from data. The project's results will contribute to the field by providing a systematic analysis of the benefits of geometric machine learning methods compared to classical Euclidean approaches. With that, the project aims to develop a deeper theoretical understanding of geometric machine learning and offer practical, empirically validated guidelines for the application of such methods.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2406232","Extreme Value Theory for diffusive particle systems with mean-field interaction","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Nikolaos Kolliopoulos","MI","Regents of the University of Michigan - Ann Arbor","Continuing Grant","Pedro Embid","07/31/2027","$106,291.00","","nkolliop@umich.edu","1109 GEDDES AVE, SUITE 3300","ANN ARBOR","MI","481091079","7347636438","MPS","126600","1303, 5294","$0.00","Classical Extreme Value Theory studies the behavior of the largest values in large collections of numerical observations of some random phenomenon, for example, the intensities of earthquakes that occurred during a certain period at a certain region on the Richter scale. Existing results allow for understanding the likelihood of observing extremely large values in the future, larger than any past observation, and thus they can help in the prediction, for example, of extreme natural phenomena. This project aims to extend this theory to sets of interdependent numerical values, with a certain kind of interaction that is present in models which are widely used in Finance, Medicine, and other areas. Models of this type are used to describe: (a) quantities like the default likelihoods of competing financial institutions, where an extreme value analysis can help in the estimation of the probability of a credit event in the future; (b) the market capitalizations of all the companies in some large market, where an understanding of the extreme order statistics can be helpful in the improvement of existing models in stochastic portfolio theory; (c) The electrical potentials in human neurons, where extreme values might be associated with certain diseases. This award will also provide opportunities for students to be involved in this research.
The aim of this project is to study the convergence of the appropriately normalized upper and intermediate order statistics of certain systems of SDEs as their size grows towards infinity. The equations interact through the dependence of the coefficients either directly on the systemic empirical measure, or on control processes that are picked to minimize some costs which are functions of the empirical measure. The first step is the reduction to the case where the SDEs are independent through the establishment of propagation of chaos, while the second step involves the treatment of this simple case using techniques from classical Extreme Value Theory and Malliavin Calculus. As statistical estimators for the parameters of the limiting distributions are functions of intermediate order statistics, properties like estimator consistency will also be extended to the case of interacting diffusions, allowing for the estimation of extreme value parameters associated with observed populations. Then, the probability of observing very large values in the future can be estimated by performing a time series analysis on the estimated parameters.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2405348","Theoretical and algorithmic foundations of novel medical imaging modalities","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Leonid Kunyansky","AZ","University of Arizona","Continuing Grant","Stacey Levine","07/31/2027","$183,980.00","","leonk@math.arizona.edu","845 N PARK AVE RM 538","TUCSON","AZ","85721","5206266000","MPS","126600","079Z","$0.00","Since the invention of computer-aided tomography, numerous imaging modalities have been introduced and have demonstrated increasing value as diagnostic instruments in biology and medicine. These include Thermoacoustic and Photoacoustic Tomography (TAT and PAT), Magnetoacoustoelectric Tomography (MAET), and UltraSound Current Density Imaging (USCDI), which combine the high resolution of ultrasound with the sensitivity of electromagnetic waves to optical absorption and conductivity of the tissues, or to biological currents in the heart or in the brain. Sharp abnormalities in the latter physical parameters are good markers of breast cancer, thrombosis, ischemia, epilepsy and other medical conditions. These new techniques overcome limitations of classical tomography, and deliver otherwise unavailable, potentially life-saving diagnostic information ? at a lesser cost and with less harm to a patient. The images in these modalities are obtained by complex mathematical procedures, rather than through direct acquisition. The underlying mathematics and the image reconstruction algorithms required by these methods are at very early stages of development. The investigator, in collaboration with domain scientists and medical engineers will work to resolve some of the central theoretical challenges and develop efficient numerical techniques for PAT, TAT, MAET and UCSDI. A graduate student will play a significant role in the project, gaining exposure to practical and theoretical skills lying at the junction of exact sciences, medicine, and biology. The results of this project will be disseminated through publications in high quality research journals, presentations at national and international conferences, and a series of lectures at various major venues. The algorithms and the experimental data will be made publicly available.
The mathematics underlying and enabling such modalities as PAT, TAT, MAET and USCDI contains a number of challenging open questions, important from both the theoretical and applied points of view. The investigator, with his graduate student and colleagues at several universities, aims to gain theoretical understanding and to develop efficient image reconstruction algorithms for these modalities. In particular, they plan to (1) derive range conditions for the wave operator arising in TAT and PAT, defined on a spherical domain with a reduced measurement interval, (2) develop fast algorithms enabling deep learning techniques in data-intensive inverse problems of TAT and PAT, and (3) derive, justify, and explore an accurate model of ultrasound current detection in an acoustically perturbed medium.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2407615","Singularities and Long Time Dynamics in Models of Fluids and Reactive Processes","DMS","APPLIED MATHEMATICS","08/15/2024","08/06/2024","Andrej Zlatos","CA","University of California-San Diego","Standard Grant","Pedro Embid","07/31/2027","$240,000.00","","azlatos@ucsd.edu","9500 GILMAN DR","LA JOLLA","CA","920930021","8585344896","MPS","126600","","$0.00","This project is composed of two parts. The main objective is the study of mathematical models of fluid motions, and more specifically creation of small-scale structures and singularities in fluids. This is a question of great importance in mathematics as well as in physics and engineering, as it is related to fluid turbulence and also explores how well the theoretical models describe real world phenomena in extreme situations. The project will focus on singularity creation for motions of fluids in porous media (e.g. underground aquifers), in atmospheric science models, as well as for dynamics of fluids near walls and other boundaries. The second objective of the project is the study of propagation of reactive processes (e.g. forest fires) through combustive media. While the dynamics of such a process may intricately depend on small scale variations in the environment, the goal of this part of the project is to demonstrate that in many situations averaging of these variations over large regions results in a more regular and predictable large scale and long-term behavior of the process. This part will also involve the study of propagation of bacterial colonies through nutrient-rich environments, and the enhancement of its speed due to the phenomenon of bacterial chemotaxis. The proposal will provide opportunities for the involvement of students and junior researchers in the research projects.
The primary focus of this project is the study of singularities and singular solutions for several nonlinear partial differential equations (PDEs) that serve as models of incompressible fluid dynamics. This includes motion of fluids in porous media on domains with boundaries, such as aquifers sitting on top of impermeable rocky layers; atmospheric science models such as generalized surface quasi-geostrophic (gSQG) equations; as well as Euler equations, modeling motions of ideal fluids, on planar domains with irregular boundaries. In some of these models the relevant local well-posedness theories have not been found yet, so their development will also be an integral part of the project. A secondary focus of the project is a better understanding of large-scale behavior of reactive processes spreading through heterogeneous media, specifically development of a homogenization theory for the nonlinear reaction-diffusion PDE that models such processes occurring in multi-dimensional random media. The goal is to show that under fairly general hypotheses, large scale behavior of solutions to this model is governed by a much simpler homogenized PDE that captures the effects of the random variations in the medium averaging out in the long term. In addition, effects of chemotaxis on the speed of propagation of bacterial colonies through nutrient-rich environments will also be studied.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2408407","Collaborative Research: NSF-UEFISCDI: RUI: Effects of the Interplay Between Connectivity Architecture and Distributed Delays in Brain Network Dynamics","DMS","APPLIED MATHEMATICS","10/01/2024","08/08/2024","Anca Radulescu","NY","SUNY College at New Paltz","Standard Grant","Stacey Levine","09/30/2027","$266,538.00","","radulesa@newpaltz.edu","75 S MANHEIM BLVD","NEW PALTZ","NY","125612407","8452573282","MPS","126600","","$0.00","Over the past two decades researchers have made significant progress in developing mathematical models and tools that are compatible with understanding the complexity of the human brain and similarly complex systems. These tools have been used to investigate how complex neural interactions underlie dynamic patterns found in processes like learning, memory formation and cognition. However, many questions on both healthy and pathological brain function remain intractable by existing mathematical approaches. That is because the system's components interact both spatially and temporally. Hence, in order to model and understand differences between healthy and pathological function in a neural circuit, one needs to simultaneously keep track of connectivity architecture in a massive network, and of its past activity. This poses a significant challenge for both analytical and computational approaches. This project aims to establish and use a tractable quantitative framework that considers both of these aspects, by employing networks of coupled equations that include time delays to capture how recent interactions between the elements of the system influence future interactions. A traditional model of neural population dynamics will be used as the building block for larger functional brain circuits, while additionally incorporating different types of time-delays. This will enable a well-studied framework to be embedded into a new mathematical environment that jointly considers the system's architecture and history. Preliminary joint work (on toy network models with selected types of time delays) has established in principle that this approach is computationally tractable, and that it can be used to contextualize transitions between healthy brain function and pathological patterns (such as those found in Parkinson's disease and emotional disorders). The project will support a vertically integrated team including a postdoctoral fellow, a co-advised Ph.D. student and five undergraduate students at a predominantly undergraduate institution, recruited from underrepresented groups. The collaboration capitalizes on the team's combined expertise in network science, delay differential equations and brain imaging techniques.
The team will combine new network techniques with novel approaches to distributed time delays. The theoretical methods will be integrated with human brain data for potential clinical applications, via a collaboration with the Neurology Department at the University at Buffalo. The approach will encompass three aspects that will develop simultaneously and support each other. (1) A general theory will be developed for networks under minimal assumptions on the architecture and the shape of the delay kernels. (2) Numerical simulations will be used to demonstrate dynamic behaviors in specific classes of complex networks and for structured or stochastic distributions of delay kernels across the network nodes. (3) The new mathematical framework and numerical algorithms will be used to investigate how timing impacts information propagation in neural circuits that govern specific behaviors, in both computational models and empirical data. The methods developed in this project will thus help us understand physiological mechanisms behind imaging and behavioral observations and help identify the underpinnings of pathological behaviors in neurological and psychiatric illnesses.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
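For orientation, a delay-coupled firing-rate model of the kind alluded to above can be simulated with a simple history buffer. The sketch below uses a two-unit Wilson-Cowan-type loop with a single discrete delay; the weights, delay, and gain function are illustrative guesses, not the project's model, which will treat distributed delay kernels across large networks.

```python
import numpy as np

# Minimal sketch: two delay-coupled firing-rate units,
#   r_i'(t) = -r_i(t) + S( sum_j w_ij * r_j(t - tau_ij) ),
# integrated by forward Euler with a history buffer.  The weights,
# delays, and sigmoidal gain below are illustrative choices only.
S = lambda u: 1.0 / (1.0 + np.exp(-u))

dt, T = 0.001, 50.0
n_steps = int(T / dt)
w = np.array([[0.0, 6.0], [-6.0, 0.0]])   # excitatory/inhibitory loop
tau = np.array([[0.0, 1.0], [1.0, 0.0]])  # coupling delays (time units)
lag = (tau / dt).astype(int)              # delays measured in steps

hist = np.full((n_steps + lag.max() + 1, 2), 0.1)  # constant history
for k in range(lag.max(), n_steps + lag.max()):
    inp = [sum(w[i, j] * hist[k - lag[i, j], j] for j in range(2))
           for i in range(2)]
    hist[k + 1] = hist[k] + dt * (-hist[k] + S(np.array(inp)))

print("late-time range of r_0:",
      hist[-5000:, 0].min(), "to", hist[-5000:, 0].max())
```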
"2406447","Inverse Boundary Value Problems","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Gunther Uhlmann","WA","University of Washington","Standard Grant","Stacey Levine","07/31/2027","$300,000.00","","gunther@math.washington.edu","4333 BROOKLYN AVE NE","SEATTLE","WA","981951016","2065434043","MPS","126600","","$0.00","The ability to determine the internal properties of a medium by making measurements at the boundary of the medium provides important insight in a wide range of scientific applications. The question is whether one can one ""see"" what is inside the medium by making measurements on the outside. This project involves establishing a deeper mathematical understanding of the inverse imaging technique called electrical impedance tomography (EIT), which arises both in medical imaging and geophysics. EIT attempts to determine the electrical properties of an object by making voltage and current measurements from electrodes located at the boundary of the object. This project will also investigate the question of determining the inner structure of the Earth by measuring the travel times of earthquakes measured at different seismic stations located throughout the Earth. Graduate students will be trained and contribute to these projects.
This project will address the mathematical theory of several fundamental inverse problems arising in many areas of science and technology including medical imaging, geophysics, astrophysics and nondestructive testing, to name a few. Three topics of research will be addressed. The first one is Electrical Impedance Tomography (EIT), also called Calderon's problem. The second topic is travel time tomography in anisotropic media. The third topic is inverse problems for non-linear hyperbolic equations. EIT is an inverse method used to determine the conductivity of a medium by making voltage and current measurements at the boundary. Specific projects will address mathematical challenges in developing and understanding the frameworks that address the case of partial data, anisotropic conductors, the recovery of discontinuities of a medium from boundary information, quasilinear model equations, and high frequencies for anisotropic media. An understanding of travel time tomography involves the determination of a Riemannian metric (anisotropic sound speed) in the interior of a domain from the lengths of geodesics joining points of the boundary (travel times) and from other kinematic information. This project will address the two-dimensional scenario, the range characterization and boundary rigidity for simple manifolds, and a novel metric from the area of minimal surfaces bounded by closed curves on the boundary. The investigator will also develop a framework for using the interaction of waves to create new waves that will give information about the object being probed. Specific topics include the study of an inverse problem for the non-linear Klein Gordon equation and inverse problems arising in fluid dynamics.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
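A worked toy example conveys why EIT is delicate. In one dimension, the boundary data of the conductivity equation determine only the harmonic mean of the conductivity, so distinct conductivity profiles can be indistinguishable from the boundary; the sketch below, with an arbitrarily chosen profile, verifies this numerically. It is offered purely for orientation and is not drawn from the award.

```python
import numpy as np

# 1D toy version of the Calderon problem: for -(sigma(x) u'(x))' = 0 on
# [0,1] with u(0) = 0, u(1) = 1, the steady current is
#     J = 1 / integral_0^1 dx / sigma(x),
# so boundary voltage-to-current data see only the harmonic mean of
# sigma.  Two conductivities with the same harmonic mean are therefore
# boundary-indistinguishable in 1D; the higher-dimensional problem
# studied in the award is what makes unique recovery possible.
x = np.linspace(0, 1, 200001)
sigma1 = 1.0 + 0.5 * np.sin(2 * np.pi * x) ** 2    # illustrative profile
c = np.mean(1.0 / sigma1)             # grid approximation of the integral
sigma2 = np.full_like(x, 1.0 / c)     # constant conductivity, same integral

for s in (sigma1, sigma2):
    print("boundary current J =", 1.0 / np.mean(1.0 / s))
```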
"2406283","Aviles-Giga Conjecture, Differential Inclusions and Rigidity","DMS","APPLIED MATHEMATICS","08/15/2024","08/07/2024","Andrew Lorent","OH","University of Cincinnati Main Campus","Standard Grant","Dmitry Golovaty","07/31/2027","$149,999.00","","andrew.lorent@uc.edu","2600 CLIFTON AVE","CINCINNATI","OH","452202872","5135564358","MPS","126600","","$0.00","The project aims to advance understanding of some key problems in the field of Calculus of Variations, specifically the Aviles-Giga conjecture, and more broadly, how restrictions on gradients of functions imply rigidity, stability, and compactness properties. The Aviles-Giga conjecture is a central open problem in the Calculus of Variations, modeling phenomena such as thin film blistering and micromagnetics. The conjecture seeks to provide a mathematical justification for a scaling law observed in physics, leading to more accurate modeling of certain physical phenomena. Part of the conjecture involves sharp regularity estimates for a well-studied class of equations known as Eikonal equations, which arise in liquid crystal models and optics. These estimates are valuable for numerically solving such equations and are of broad mathematical interest. The Aviles-Giga theory is closely connected to the theory of scalar conservation laws, and its methods are being applied to understand a class of solutions of scalar conservation laws that arise in probability, specifically the large deviation conjecture. The project also aims to propagate its outcomes through seminars, lectures, graduate student recruitment, and the research produced.
The project considers problems in the Calculus of Variations. The first problem is the Aviles-Giga conjecture, where the main open problem is showing that the energy concentrates, as it is not even known if the measure representing the limiting energy is singular. Achieving this goal would lead to a complete understanding of the regularizing properties of the Eikonal equation on the Besov scale. The second problem deals with quantitative rigidity for non-elliptic differential inclusions and builds on a previous result for rotation matrices and an optimal generalization to connected 1D elliptic curves in the space of two-by-two matrices. One purpose of this work is a more general regularity/rigidity theory for non-elliptic curves. The third project studies compensated compactness and conservation laws in higher dimensions. Reformulating regularity and uniqueness questions of PDEs as differential inclusions has led to the solution of a number of outstanding conjectures. This part of the research focuses on further developing methods initiated by the principal investigator and collaborators to study the differential inclusion problem related to regularity and uniqueness questions for conservation laws in higher dimensions. The final project on gamma-convergence for the Bellettini-Bertini-Mariani-Novaga functional considers a proposed gamma-limit related to certain conjectures in large deviation theory. The project focuses on a special case of this conjecture.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2408585","Analysis of Continuum PDE's in Collective Behavior and Related Models","DMS","APPLIED MATHEMATICS","08/15/2024","08/05/2024","Trevor Leslie","IL","Illinois Institute of Technology","Standard Grant","Hailiang Liu","07/31/2027","$150,000.00","","tleslie@iit.edu","10 W 35TH ST","CHICAGO","IL","606163717","3125673035","MPS","126600","","$0.00","This project concerns the fundamental mechanisms underpinning collective behavior of large groups of agents, such as flocks of birds, schools of fish, or swarms of bacteria. Mathematical models for these phenomena offer insights into how large-scale structures emerge from small-scale interactions in physical systems, with potential applications in technology, including in computer graphics. In order to efficiently study systems with an otherwise intractable number of agents, this project will focus on the ""effective"" large-scale dynamics rather than on individual trajectories. Taking this perspective brings the problems of interest into the realm of partial differential equations. The models that arise in these problems bear substantial resemblance to equations found in fluid dynamics and continuum mechanics, a connection that will be leveraged extensively in the research to be carried out. The mentorship, training, and professional development of students and junior researchers will also be a key goal of the project.
The proposed analysis will center on the effects of a nonlocal velocity alignment mechanism in isolation, as manifested in the class of hydrodynamic equations known as Euler Alignment systems. The PI will investigate the consequences of imposing different communication rules, especially as they relate to the large-time structure and regularity of the density profile. Emphasis will be placed on the as-yet poorly understood transition between qualitatively different regimes of interactions. In particular, the PI will leverage the additional structure available in settings with simple geometries to draw connections between models that incorporate strongly localized alignment and those that feature sticky particles. The PDEs governing alignment dynamics serve as a paradigm for more general nonlocal equations, and the proposed research has the potential to advance the understanding of classes of nonlocal models far beyond those explicitly studied in the project.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
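The Euler Alignment systems named above arise as hydrodynamic descriptions of Cucker-Smale particle dynamics. The following minimal particle sketch, with an illustrative fat-tailed communication kernel (one standard choice, not one singled out by the project), shows the characteristic flocking behavior: velocities contract toward a common value.

```python
import numpy as np

# Minimal Cucker-Smale sketch; the Euler Alignment system is the
# hydrodynamic description of this kind of particle dynamics:
#   x_i' = v_i,   v_i' = (1/N) sum_j phi(|x_j - x_i|) (v_j - v_i),
# with a nonlocal communication kernel phi.  Kernel and parameters
# below are illustrative choices.
phi = lambda r: (1.0 + r**2) ** (-0.25)   # fat tail => unconditional flocking

rng = np.random.default_rng(1)
N, dt, steps = 200, 0.02, 2000
x = rng.uniform(-1, 1, N)
v = rng.normal(0, 1, N)

print("initial velocity spread:", v.max() - v.min())
for _ in range(steps):
    dx = x[None, :] - x[:, None]          # pairwise position offsets
    dv = v[None, :] - v[:, None]          # pairwise velocity offsets
    v += dt * (phi(np.abs(dx)) * dv).mean(axis=1)
    x += dt * v

print("velocity spread after flocking:", v.max() - v.min())
```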
diff --git a/Computational-Mathematics/Awards-Computational-Mathematics-2024.csv b/Computational-Mathematics/Awards-Computational-Mathematics-2024.csv
index 2cff070..1f5ff43 100644
--- a/Computational-Mathematics/Awards-Computational-Mathematics-2024.csv
+++ b/Computational-Mathematics/Awards-Computational-Mathematics-2024.csv
@@ -1,16 +1,17 @@
"AwardNumber","Title","NSFOrganization","Program(s)","StartDate","LastAmendmentDate","PrincipalInvestigator","State","Organization","AwardInstrument","ProgramManager","EndDate","AwardedAmountToDate","Co-PIName(s)","PIEmailAddress","OrganizationStreet","OrganizationCity","OrganizationState","OrganizationZip","OrganizationPhone","NSFDirectorate","ProgramElementCode(s)","ProgramReferenceCode(s)","ARRAAmount","Abstract"
+"2410538","Multiscale Discontinuous Galerkin Methods for Kinetic Models of Gas and Plasma","DMS","OFFICE OF MULTIDISCIPLINARY AC, COMPUTATIONAL MATHEMATICS","09/01/2024","08/08/2024","James Rossmanith","IA","Iowa State University","Standard Grant","Ludmil T. Zikatanov","08/31/2027","$167,000.00","","rossmani@iastate.edu","1350 BEARDSHEAR HALL","AMES","IA","500112103","5152945225","MPS","125300, 127100","9150, 9263","$0.00","Variants of the celebrated Boltzmann equation can be used to model the dynamics of rarefied gases (i.e., collections of molecules that move around in space and interact through collisions), as well as plasma (i.e., collections of positively and negatively charged ions that move around in space and interact through collisions and electromagnetic forces). As such, solutions of the Boltzmann equation can be used to describe and predict the dynamics in various applications, such as flow in microfluidic devices, hypersonic and space vehicle aerodynamics, flow in magnetically confined fusion reactors, and particle acceleration in laser-plasma systems. A critical challenge is that computing solutions to the Boltzmann equation in realistic scenarios is prohibitively expensive, even on modern massively parallel computers. An important goal of this research is to develop reduced-order models that can capture important flow features but that can be more readily solved on modern computer architectures. The approach pursued in this research is to decompose the solution into a macroscopic portion that describes large-scale features and a microscopic portion that describes smaller-scale features; macroscopic features can be computed relatively cheaply and accurately, while microscopic features are expensive to compute. Various adaptive strategies are explored to reduce the expense of the microscopic portions.
The primary objective of this research is to develop accurate and efficient computational methods for solving the kinetic Boltzmann and Vlasov equations for modeling rarefied gases and plasma. The main challenge in solving kinetic models is that solutions live in high-dimensional phase space and contain information over wide-ranging spatial and temporal scales. An important goal is to develop reduced models that capture the important physics and can be more readily solved on modern computer architectures. The approach pursued in this research is based on decomposing the kinetic particle density function into macroscopic and microscopic pieces, allowing for different computational techniques on each portion. This research leverages several key innovations, including (1) high-order discontinuous Galerkin finite element methods for spatial discretization, (2) novel explicit and semi-implicit time-stepping techniques, (3) adaptive refinement strategies to reduce the computational expense of the microscopic portion of the update, and (4) implementation of the resulting algorithms on massively parallel computer architectures. Verification and validation will be performed on several test cases relevant to the simulation of rarefied gases and plasma.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
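The macroscopic/microscopic decomposition described above can be made concrete on a velocity grid. In the sketch below, a kinetic density f (a two-beam profile chosen purely for illustration) is split as f = M + g, where M is the Maxwellian matching the mass, momentum, and energy of f; the remainder g, which adaptive strategies of the kind the award explores aim to treat cheaply, contributes nothing to those three moments.

```python
import numpy as np

# Macro/micro splitting f = M + g on a 1D velocity grid: M is the
# Maxwellian sharing f's mass, momentum, and energy (the cheap,
# macroscopic part); g = f - M is the microscopic remainder.  The
# sample f below (two drifting beams) is purely illustrative.
v = np.linspace(-8, 8, 401)
dv = v[1] - v[0]
f = np.exp(-(v - 2.0) ** 2) + 0.5 * np.exp(-(v + 1.0) ** 2)

rho = np.sum(f) * dv                        # mass
u = np.sum(v * f) * dv / rho                # bulk velocity
T = np.sum((v - u) ** 2 * f) * dv / rho     # temperature (1D, k_B = m = 1)

M = rho / np.sqrt(2 * np.pi * T) * np.exp(-(v - u) ** 2 / (2 * T))
g = f - M

# g carries no mass, momentum, or energy: all three moments vanish
# up to quadrature error.
for w in (np.ones_like(v), v, (v - u) ** 2):
    print("moment of g:", np.sum(w * g) * dv)
```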
+"2408877","Structure Preserving Optimization Algorithms and Digital Twins","DMS","OFFICE OF MULTIDISCIPLINARY AC, COMPUTATIONAL MATHEMATICS","08/01/2024","07/31/2024","Harbir Antil","VA","George Mason University","Standard Grant","Troy D. Butler","07/31/2027","$275,000.00","","hantil@gmu.edu","4400 UNIVERSITY DR","FAIRFAX","VA","220304422","7039932295","MPS","125300, 127100","075Z, 079Z, 9263","$0.00","Optimization problems constrained by physics are ubiquitous. These problems are nonlinear, nonsmooth and contain unknown parameters. The physics describing the constraints are partial differential equations (PDEs) which are multiscale, multiphysics, and geometric in nature. They capture many realistic scenarios: control of pathogen propagation like COVID-19, blood flow in aneurysms, determining weakness in structures, and vortex control in nuclear reactors. This project will study optimization problems constrained by PDEs that can incorporate data to make decisions that are resilient to uncertainty. The proposed methods will provide new insights into nonsmooth nonconvex optimization, and they will be applied to applications such as identifying weakness in structures (e.g., bridges and nuclear plants). The results of this research will be shared with the community via publications and research talks. The outcomes of this research will benefit scientists working in multiple research areas such as numerical analysis, optimization, structural engineering and bioengineering. A PhD student will be fully supported by the project.
A particular focus of the project is on risk-averse optimization problems, in which the PDEs contain uncertainty arising from modeling unknown quantities (coefficients, boundary conditions, etc.) as random variables, and on dynamic optimization problems. The project will develop: (i) Inexact adaptive Semismooth Newton and Trust-region methods to solve these optimization problems; (ii) Primal dual methods for risk-averse optimization problems with general constraints; (iii) Applications to problems where inexactness arises from finite element discretization. Thrusts (i) and (iii) will enable interaction between finite element discretization and optimization solvers leading to structure preserving algorithms. Additionally, Thrust (ii) will lead to different penalty parameters for different constraints and will allow inexact solves at each iteration, which is essential for large systems. This will enable a new paradigm for widely used penalty-based methods. Algorithms for high-dimensional nonsmooth risk-averse optimization will help overcome the curse of dimensionality for similar problems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
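For readers unfamiliar with risk-averse formulations, the conditional value-at-risk (CVaR) is a typical risk measure in this setting, and the Rockafellar-Uryasev identity makes it computable from samples. The sketch below, in which a lognormal sample stands in for the outputs of a costly PDE solve, is a generic illustration and not the award's algorithm.

```python
import numpy as np

# Risk-averse objectives often replace E[J] by the conditional
# value-at-risk CVaR_beta[J].  The Rockafellar-Uryasev identity
#   CVaR_beta[J] = min_t  t + E[max(J - t, 0)] / (1 - beta)
# turns this into a 1D convex minimization; for a sample, the
# minimizing t is the beta-quantile (the value-at-risk).  The
# lognormal "loss" model below stands in for a costly PDE solve.
rng = np.random.default_rng(2)
beta = 0.95
samples = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)

t_star = np.quantile(samples, beta)                       # VaR_beta
cvar = t_star + np.mean(np.maximum(samples - t_star, 0.0)) / (1 - beta)
print("VaR :", t_star)
print("CVaR:", cvar)
```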
"2410737","Collaborative Research: High-order approximation of variational inequalities and bounds-constrained partial differential equations","DMS","COMPUTATIONAL MATHEMATICS","08/15/2024","08/02/2024","Daniel Shapero","WA","University of Washington","Continuing Grant","Yuliya Gorb","07/31/2027","$29,653.00","","shapero@uw.edu","4333 BROOKLYN AVE NE","SEATTLE","WA","981951016","2065434043","MPS","127100","9263","$0.00","Partial differential equations (PDE) describe a wide range of phenomena across science and engineering. Of particular interest in this project is a suite of geophysical problems, including (1) rainwater runoff over land surfaces, (2) meltwater flow through snowpack, and (3) the flow of glaciers. The complexity of these problems is far beyond the ability of classical techniques to give exact solutions, and computer simulations are required to understand the models. However, existing numerical methods fail to capture certain essential features present in the physical and mathematical systems. Frequently, these features take the form of inequality constraints -- glacier thickness or water depths cannot be negative. This project will develop new methods that provide accurate approximations to such models while fully respecting physical bounds constraints. Through the co-investigator's involvement in the Juneau Icefield Research Program, findings of this project will directly inform field research. The mathematical and computational techniques developed in this project will be incorporated in open source software projects that are widely used in research and educational endeavors, pushing forward the state of the art in scientific simulation. This project will train a doctoral student in mathematics and provide for undergraduate research experiences through the McNair Scholars program at Baylor University.
Many common variational problems obey a maximum principle, but only a restricted class of numerical methods preserves this important feature. An alternative is to explicitly enforce the maximum principle by recasting the problem as a variational inequality subject to physical bounds constraints. Additionally, many important problems inherently arise as variational inequalities. This project develops techniques for the bounds-constrained solution of partial differential equations and variational inequalities. These techniques will be developed with a suite of challenging geophysical problems in mind. These target applications include meltwater flow through snow, rainwater runoff over land surfaces, and glacier dynamics. These models are nonlinear, time-dependent, coupled partial differential equations with positivity constraints on a thickness variable. At the discrete level, bounds constraints are applied on the Bernstein control net of the finite element space. Since this allows for high-order spatial accuracy, it is also important to obtain high temporal accuracy for dynamic problems. This project will adapt Runge-Kutta methods to ensure bounds constraints hold over time. For single-stage methods, one can advance the solution by posing a variational inequality that minimizes a defect subject to bounds constraints. For higher-order time-stepping methods based on polynomial approximation, such as Galerkin-in-time or collocation-type Runge-Kutta methods, one can force the collocating polynomial to satisfy the bounds constraints uniformly in time, which may be especially important for tightly coupled processes. In addition to method development and computational applications, research will address how to adapt stability and convergence theory for Runge-Kutta methods to the bounds-constrained setting. To facilitate broader adoption by the computational mathematics and geophysics communities, newly-developed methodology will be included in the Firedrake project and the icepack library for glacier modeling built on top of it. Aspects of this research will be included in applied mathematics graduate classes at Baylor University and in earth and space sciences at the University of Washington.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
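The discrete bounds enforcement mentioned above ("bounds constraints are applied on the Bernstein control net") rests on the convex-hull property of the Bernstein basis: the values of a polynomial on [0,1] are bracketed by its Bernstein coefficients. The sketch below checks this numerically for an arbitrary illustrative control net.

```python
import numpy as np
from math import comb

# Convex-hull property of the Bernstein basis: for
#   p(x) = sum_k c_k * B_{k,n}(x),  B_{k,n}(x) = C(n,k) x^k (1-x)^(n-k),
# on [0,1] one has  min_k c_k <= p(x) <= max_k c_k, because the B_{k,n}
# are nonnegative and sum to one.  Constraining the control net (the
# c_k) therefore guarantees pointwise bounds on p.  The coefficients
# below are arbitrary illustrative values.
n = 4
c = np.array([0.2, 0.9, 0.1, 0.6, 0.4])   # Bernstein control net

x = np.linspace(0.0, 1.0, 1001)
B = np.array([comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)])
p = c @ B

print("control-net bounds:", c.min(), c.max())
print("actual range of p :", p.min(), p.max())   # nested inside the net
```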
"2408928","A comprehensive mathematical and computational framework for next generation stent design","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","08/01/2024","Suncica Canic","CA","University of California-Berkeley","Standard Grant","Ludmil T. Zikatanov","07/31/2027","$220,000.00","","canics@berkeley.edu","1608 4TH ST STE 201","BERKELEY","CA","947101749","5106433891","MPS","127100","9263","$0.00","Stents are mesh-like tubes which hold blood vessels and air passages open. There are many types of stents, including bare metal stents, drug eluting stents (DES), airway stents, and stents that anchor bioartifical heart valves in transcatheter (aortic) valve replacement procedures. DES is a metallic mesh platform coated with an anti-inflammatory pharmacologic agent (drug) to reduce re-blocking (restenosis) of coronary arteries and allow normal blood supply to the heart muscle. Implantation of DES continues to be the method of choice in the treatment of patients with symptomatic coronary artery disease. The FDA first approved the use of 3D-printed airway stents in 2020. With the rapid development of 3D printing technology, it is only a matter of time until 3D printed vascular stent are slated for FDA approval. This is what makes this research pressing and timely. This project aims to deliver a comprehensive unified mathematical and computational framework for optimal stent design, producing digital stents ready for 3D printing, tailored to specific uses and patient geometries. In addition to developing novel mathematical and computational approaches, which will influence the field of mathematics, this project will produce tools for designing patient-specific digital stents. Furthermore, it will provide a platform for interdisciplinary mentoring of students and postdoctoral researchers by the main investigators, who include a mathematician, an engineer, and an interventional cardiologist.
The mathematical framework to be developed in this project consists of three modules:
1. A reduced model optimization module for optimal design of mesh-like structures. This stent optimization algorithm outperforms classical engineering and ad hoc optimization approaches in terms of speed and accuracy, since it relies on sophisticated mathematical approaches rooted in dimension reduction modeling and optimization. This is the first stent optimization model (and a computational scheme) that is based on reduced, 1D network modeling of stents. A comparison with Genetic Algorithm, Proper Orthogonal Decomposition, and Deep Autoencoder Neural Network approaches will be performed. The stent prototypes will be 3D printed and mechanically tested in a Biomechanics Lab at Berkeley. Medical oversight will be provided by an interventional cardiologist.
2. A fluid-stent-poroelastic structure interaction module simulating the interaction between the blood flow and artery wall with implanted stent, where the arterial walls are modeled as poroelastic solids consisting of two layers: a thin reticular shell layer modeling the intimal layer of arterial walls, and a thick hyperelastic layer modeling the media-adventitia complex. This is the first fluid-structure interaction model that accounts for the multi-layered poroelastic structure of arterial walls, and it includes an implanted stent. A novel partitioned scheme to solve this problem will be developed.
3. A nonlinear advection-reaction-diffusion module simulating drug transport within the vascular wall and in the vascular lumen capturing the pharmacokinetics and advection, reaction, and diffusion processes of anti-inflammatory agents used to coat DES. The models are defined on moving domains. They utilize the advection velocity and the moving domain location calculated in Step 2 above. A monolithic computational scheme will be developed to solve the problem. This module is particularly relevant for the analysis of the performance of drug eluting stents.
An integral part of the project will be interdisciplinary student mentoring and research dissemination. This will be achieved by running a Hot Topics Workshop at the SLMath Institute, publishing in first-rate journals, and presenting research at mathematical, engineering, and medical conferences and workshops.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2408877","Structure Preserving Optimization Algorithms and Digital Twins","DMS","OFFICE OF MULTIDISCIPLINARY AC, COMPUTATIONAL MATHEMATICS","08/01/2024","07/31/2024","Harbir Antil","VA","George Mason University","Standard Grant","Troy D. Butler","07/31/2027","$275,000.00","","hantil@gmu.edu","4400 UNIVERSITY DR","FAIRFAX","VA","220304422","7039932295","MPS","125300, 127100","075Z, 079Z, 9263","$0.00","Optimization problems constrained by physics are ubiquitous. These problems are nonlinear, nonsmooth and contain unknown parameters. The physics describing the constraints are partial differential equations (PDEs) which are multiscale, multiphysics, and geometric in nature. They capture many realistic scenarios: control of pathogen propagation like COVID-19, blood flow in aneurysms, determining weakness in structures, and vortex control in nuclear reactors. This project will study optimization problems constrained by PDEs that can incorporate data to make decisions that are resilient to uncertainty. The proposed methods will provide new insights into nonsmooth nonconvex optimization, and they will be applied to applications such as identifying weakness in structures (e.g., bridges and nuclear plants). The results of this research will be shared with the community via publications and research talks. The outcomes of this research will benefit scientists working in multiple research areas such as numerical analysis, optimization, structural engineering and bioengineering. A PhD student will be fully supported by the project.
Particular focus of the project is on risk-averse optimization problems where the PDEs contain uncertainty arising from modeling unknown quantities (coefficients, boundary conditions, etc.) as random variables and dynamic optimization problems. The project will develop: (i) Inexact adaptive Semismooth Newton and Trust-region methods to solve these optimization problems; (ii) Primal dual methods for risk-averse optimization problems with general constraints; (iii) Applications to problems where inexactness arise from finite element discretization. Thrusts (i) and (iii) will enable interaction between finite element discretization and optimization solvers leading to structure preserving algorithms. Additionally, Thrust (ii) will lead to different penalty parameters for different constraints and will allow inexact solves at each iteration which is essential for large systems. This will enable a new paradigm for widely used penalty-based methods. Algorithms for high-dimensional nonsmooth risk-averse optimization will help overcome curse of dimensionality for similar problems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2432168","Conference: Interagency Analysis and Modeling Group/Multiscale Modeling Consortium (IMAG/MSM) Meeting on Operationalizing the NASEM Report on Digital Twins","DMS","STATISTICS, COMPUTATIONAL MATHEMATICS, MATHEMATICAL BIOLOGY","08/15/2024","08/02/2024","Gary An","VT","University of Vermont & State Agricultural College","Standard Grant","Troy D. Butler","07/31/2025","$30,000.00","","gan@med.med.edu","85 S PROSPECT STREET","BURLINGTON","VT","054051704","8026563660","MPS","126900, 127100, 733400","075Z, 079Z, 7556, 9150","$0.00","On December 15, 2023, The National Academies of Sciences, Engineering and Medicine (NASEM) released a report entitled: ?Foundational Research Gaps and Future Directions for Digital Twins? (?NASEM DT REPORT?). The purpose of this report was to bring structure to the burgeoning field of digital twins by providing a working definition and a series of research challenges that need to be addressed to allow this technology to fulfill its full potential. The concept of digital twins is compelling and has the potential to impact a broad range of domains. For instance, digital twins have either been proposed or are currently being developed for manufactured devices, buildings, cities, ecologies and the Earth as a whole. It is natural that the concept be applied to biology and medicine, as the most recognizable concept of a ?twin? is that of identical human twins. The application of digital twins to biomedicine also follows existing trends of Personalized and Precision medicine, in short: ?the right treatment for the right person at the right time.? Fulfilling the promise of biomedical digital twins will require multidisciplinary Team Science that brings together various experts from fields as diverse as medicine, computer science, engineering, biological research, advanced mathematics and ethics. The purpose of this conference, the ?2024 Interagency Modeling and Analysis Group (IMAG)/Multiscale Modeling (MSM) Consortium Meeting: Setting up Teams for Biomedical Digital Twins,? is to do exactly this: bringing together such needed expertise in a series of teaming exercises to operationalize the findings of the NASEM DT REPORT in the context of biomedical digital twins. As part of outreach and training efforts to broaden the participation within this growing field, this workshop will provide support for both traditionally under-represented categories of senior researchers as well as junior researchers such as graduate students and postdoctoral researchers.
Facilitating the development and deployment of biomedical digital twins requires operationalizing the findings and recommendations of the NASEM DT REPORT, which raises a series of specific and unique challenges in the biomedical domain. More specifically, there are numerous steps that need to be taken to convert the highly complex simulation models of biological processes developed by members of the MSM Consortium into biomedical digital twins that are compliant with the definition of digital twins presented in the NASEM DT REPORT. There are also identified challenges associated with these various steps. Some of these challenges can benefit from lessons learned in other domains that have developed digital twins while others will require the development of new techniques in the fields of statistics, computational mathematics and mathematical biology. This task will require multidisciplinary collaborations between mathematicians, computational researchers, experimental biologists and clinicians. This IMAG/MSM meeting will promote the concepts of Team Science to bring together experienced multiscale modeling researchers and experts from the mathematical, statistical, computational, experimental and clinical communities to form the multidisciplinary teams needed to operationalize the findings of the NASEM DT REPORT. The website for this meeting is at https://www.imagwiki.nibib.nih.gov/news-events/announcements/2024-imagmsm-meeting-september-30-october-2-2024, with the landing page for the Interagency Modeling and Analysis Group at https://www.imagwiki.nibib.nih.gov/.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410736","Collaborative Research: High-order approximation of variational inequalities and bounds-constrained partial differential equations","DMS","COMPUTATIONAL MATHEMATICS","08/15/2024","08/02/2024","Robert Kirby","TX","Baylor University","Standard Grant","Yuliya Gorb","07/31/2027","$289,806.00","","Robert_Kirby@baylor.edu","700 S UNIVERSITY PARKS DR","WACO","TX","767061003","2547103817","MPS","127100","9263","$0.00","Partial differential equations (PDE) describe a wide range of phenomena across science and engineering. Of particular interest in this project is a suite of geophysical problems, including (1) rainwater runoff over land surfaces, (2) meltwater flow through snowpack, and (3) the flow of glaciers. The complexity of these problems is far beyond the ability of classical techniques to give exact solutions, and computer simulations are required to understand the models. However, existing numerical methods fail to capture certain essential features present in the physical and mathematical systems. Frequently, these features take the form of inequality constraints -- glacier thickness or water depths cannot be negative. This project will develop new methods that provide accurate approximations to such models while fully respecting physical bounds constraints. Through the co-investigator's involvement in the Juneau Icefield Research Program, findings of this project will directly inform field research. The mathematical and computational techniques developed in this project will be incorporated in open source software projects that are widely used in research and educational endeavors, pushing forward the state of the art in scientific simulation. This project will train a doctoral student in mathematics and provide for undergraduate research experiences through the McNair Scholars program at Baylor University.
Many common variational problems obey a maximum principle, but only a restricted class of numerical methods preserves this important feature. An alternative is to explicitly enforce the maximum principle by recasting the problem as a variational inequality subject to physical bounds constraints. Additionally, many important problems inherently arise as variational inequalities. This project develops techniques for the bounds-constrained solution of partial differential equations and variational inequalities. These techniques will be developed with a suite of challenging geophysical problems in mind. These target applications include meltwater flow through snow, rainwater runoff over land surfaces, and glacier dynamics. These models are nonlinear, time-dependent, coupled partial differential equations with positivity constraints on a thickness variable. At the discrete level, bounds constraints are applied on the Bernstein control net of the finite element space. Since this allows for high-order spatial accuracy, it is also important to obtain high temporal accuracy for dynamic problems. This project will adapt Runge-Kutta methods to ensure bounds constraints hold over time. For single-stage methods, one can advance the solution by posing a variational inequality that minimizes a defect subject to bounds constraints. For higher-order time-stepping methods based on polynomial approximation, such as Galerkin-in-time or collocation-type Runge-Kutta methods, one can force the collocating polynomial to satisfy the bounds constraints uniformly in time, which may be especially important for tightly coupled processes. In addition to method development and computational applications, research will address how to adapt stability and convergence theory for Runge-Kutta methods to the bounds-constrained setting. To facilitate broader adoption by the computational mathematics and geophysics communities, newly-developed methodology will be included in the Firedrake project and the icepack library for glacier modeling built on top of it. Aspects of this research will be included in applied mathematics graduate classes at Baylor University and in earth and space sciences at the University of Washington.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410252","Machine Learning-Enabled Self-Consistent Field Theory for Soft Materials","DMS","OFFICE OF MULTIDISCIPLINARY AC, COMPUTATIONAL MATHEMATICS","08/01/2024","07/31/2024","Hector Ceniceros","CA","University of California-Santa Barbara","Standard Grant","Troy D. Butler","07/31/2027","$273,291.00","","hdc@math.ucsb.edu","3227 CHEADLE HALL","SANTA BARBARA","CA","931060001","8058934188","MPS","125300, 127100","075Z, 079Z, 9263","$0.00","Computer simulations are a powerful tool for understanding, predicting, and discovering soft material formulations such as polymer systems. Polymers are composed of long molecular chains and are ubiquitous in both synthetic (e.g., nylon, polyethylene, polyester, Teflon) and natural (e.g., DNA, proteins, cellulose, nucleic acids) settings. In this project, computational tools will be developed that combine machine learning and scientific computing for the exploration and prediction of polymer systems, which will also help to accelerate the discovery of new materials. More broadly, this project will provide a framework for similar computationally costly problems that could be dramatically expedited by using machine learning. Last but not least, the project will serve as an anchor for the interdisciplinary training of both undergraduate and graduate students in an emerging field of much demand.
Parameter space exploration for a soft material is an instance of the forward problem: given a set of parameters, find the corresponding stable morphology. But the inverse design problem, which consists of obtaining the formulation parameters that stabilize a given target morphology, is also of great technological importance as it facilitates the design of new functional materials with highly tuned and desired properties. The numerical solution of both forward and inverse design problems requires the repeated evaluation of computationally costly functions. The research team will develop efficient computational methods that couple polymer self-consistent field theory with machine learning to accelerate the solution of both forward and inverse design problems, aimed at facilitating the discovery of new structures and the design of polymers and polymer systems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
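The forward/inverse design loop described above can be sketched generically: sample a costly forward map at a few parameter values, fit a cheap surrogate, and carry out the inverse design on the surrogate. In the sketch below the "forward map" is a simple stand-in function, not an actual self-consistent field theory solve, and all names and parameter ranges are illustrative.

```python
import numpy as np

# Generic surrogate-accelerated design loop: a costly forward map is
# sampled at a few parameter values, a cheap polynomial surrogate is
# fitted, and the inverse-design question "which parameter yields the
# target response?" is answered on the surrogate instead.
def expensive_forward(chi):             # placeholder for an SCFT solve
    return np.sin(3.0 * chi) + 0.5 * chi

chi_train = np.linspace(0.0, 1.0, 12)             # few costly evaluations
y_train = expensive_forward(chi_train)
coeffs = np.polyfit(chi_train, y_train, deg=5)    # cheap surrogate

target = 0.8                                       # desired response
chi_grid = np.linspace(0.0, 1.0, 10_001)          # dense, cheap search
y_surr = np.polyval(coeffs, chi_grid)
chi_star = chi_grid[np.argmin((y_surr - target) ** 2)]

print("surrogate-optimal parameter:", chi_star)
print("true response there:        ", expensive_forward(chi_star))
```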
"2409634","Structure-Preserving Linear Schemes for the Diffuse-Interface Models of Incompressible Two-Phase Flows with Matched Densities","DMS","OFFICE OF MULTIDISCIPLINARY AC, COMPUTATIONAL MATHEMATICS","08/01/2024","08/01/2024","Lili Ju","SC","University of South Carolina at Columbia","Standard Grant","Ludmil T. Zikatanov","07/31/2027","$294,473.00","","ju@math.sc.edu","1600 HAMPTON ST","COLUMBIA","SC","292083403","8037777093","MPS","125300, 127100","9150, 9263","$0.00","As a fundamental example of multi-phase flows, two-phase flows are frequently encountered in natural and industrial processes, such as mixing of the fresh water and seawater at the estuaries in marine science, oil and gas transportation in the petroleum industry, the solidification of binary alloys in materials science and so on. The interfaces between the two immiscible fluids play a crucial role in these phenomena, and the diffusive-interface approach have been widely used for their modeling due to its significant advantages in handling topological changes and easiness of implementation. These two-phase flow systems also often possess some crucial physical structures, such as energy stability, bound preservation, and mass conservation. Preservation of these structures is not only a desirable attribute of numerical schemes for their high-fidelity simulations in scientific and engineering applications but also plays a pivotal role in stability and error analysis of the numerical schemes. The project involves diverse research work in computational and applied mathematics, ranging from algorithm design, numerical analysis, to computer implementation. The research results produced from this project will be actively disseminated through publishing papers, giving talks, organizing mini-symposia and workshops, maintaining informative websites, and delivering software codes. Moreover, this project's broader impact also includes its potential to offer an exceptional opportunity for graduate students to engage in diverse interdisciplinary mathematics research.
The primary goal of the project is to develop and analyze efficient, robust, and accurate structure-preserving linear schemes for simulating diffuse-interface models of incompressible two-phase flows with matched densities. In particular, the research activities include 1) the development and analysis of robust energy-stable, decoupled linear schemes for the fluid dynamics equations based on regularization techniques, 2) accurate bound-preserving and energy-stable linear schemes for the phase field equations by utilizing the backward differentiation formulas and the prediction-correction approach, and 3) effective linear relaxation methods with structure preservation for decoupling the fluid dynamics and phase field solvers of the flow system. This project will lead to significant innovations in computational tools with high-efficiency and high-fidelity for simulations of the Navier-Stokes-Allen-Cahn and Navier-Stokes-Cahn-Hilliard systems. In addition, it will offer new insights into outstanding algorithmic issues on bound preservation and energy stability of numerical discretization for two-phase and general multi-phase flows in various scientific and engineering applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
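The energy-stability property at the heart of this project can be seen on the simplest phase-field model. The sketch below applies the classical stabilized semi-implicit (linear) scheme to the 1D Allen-Cahn equation and checks that the discrete energy decays monotonically; this is a standard scheme of the type the award analyzes, offered for illustration, not the scheme the project will construct.

```python
import numpy as np

# Stabilized semi-implicit (linear) scheme for 1D Allen-Cahn,
#   u_t = eps^2 u_xx - (u^3 - u),  periodic on [0,1]:
#   (u^{n+1}-u^n)/tau = eps^2 D2 u^{n+1} - f(u^n) - S (u^{n+1}-u^n).
# With S large enough (here S = 2 >= max|f'| / 2 on [-1,1]) the
# discrete energy E(u) = sum_j [eps^2/2 |u_x|^2 + (u^2-1)^2/4] h
# decays at every step.  All parameters are illustrative.
N, eps, tau, S, n_steps = 128, 0.05, 0.1, 2.0, 200
h = 1.0 / N
x = np.arange(N) * h

D2 = -2 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
D2[0, -1] = D2[-1, 0] = 1.0          # periodic wrap-around
D2 /= h**2

A = (1.0 / tau + S) * np.eye(N) - eps**2 * D2   # implicit linear part

def energy(u):
    ux = (np.roll(u, -1) - u) / h
    return np.sum(0.5 * eps**2 * ux**2 + 0.25 * (u**2 - 1) ** 2) * h

u = 0.05 * np.sin(2 * np.pi * x) + 0.01 * np.cos(8 * np.pi * x)
E = [energy(u)]
for _ in range(n_steps):
    rhs = u / tau + S * u - (u**3 - u)
    u = np.linalg.solve(A, rhs)
    E.append(energy(u))

# Monotone decay up to round-off.
print("energy monotone decreasing:",
      all(b <= a + 1e-12 for a, b in zip(E, E[1:])))
```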
"2408978","Finite element methods for complex surface fluids","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/30/2024","Maxim Olshanskiy","TX","University of Houston","Standard Grant","Yuliya Gorb","06/30/2027","$319,951.00","","molshan@math.uh.edu","4300 MARTIN LUTHER KING BLVD","HOUSTON","TX","772043067","7137435773","MPS","127100","9263","$0.00","Material interfaces with lateral fluidity are widespread in biology and are vital for processes at various scales, from subcellular to tissue levels. Mathematical models describe these interfaces using systems of partial differential equations on deforming surfaces, sometimes linked to equations in the bulk. These equations govern the movement of interfaces and fluid flow along them and in the surrounding medium. While existing studies often focus on simple, homogeneous fluid flows on steady surfaces, real-life scenarios are more complex. This research project will develop and analyze new computational methods for studying these complex fluid systems. In addition, open-source software for simulating evolving surface PDEs will be developed and the project will provide research training opportunities for students.
This project will develop and analyze a finite element method for the tangential fluid system posed on a moving surface, a multi-component surface flow problem, and a fluid-elastic interface model, all arising in the continuum modeling of inextensible viscous deformable membranes. A numerical approach will be employed in the project that falls into the category of geometrically unfitted discretizations. It will allow for large surface deformations, avoid the need for surface parametrization and triangulation, and have optimal complexity. The developed technique will incorporate an Eulerian treatment of time derivatives in evolving domains and employ physics-based stable and linear splitting schemes. The particular problems that will be addressed include the analysis of finite element methods for the Boussinesq-Scriven fluid problem on a passively evolving surface; the development of a stable linear finite element scheme for a phase-field model of two-phase surface flows on both steady and evolving surfaces; and the construction of a splitting scheme for equations governing the motion of a material surface exhibiting lateral fluidity and out-of-plane elasticity.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
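For orientation only: the simplest surface PDE, a Laplace-Beltrami problem on the unit circle, reduces under arc-length parametrization to a periodic 1D problem, as in the sketch below. The geometrically unfitted methods this project develops exist precisely because such parametrizations are unavailable for general evolving surfaces; the sketch only fixes ideas, and the exact solution chosen is arbitrary.

```python
import numpy as np

# Toy surface PDE: -Laplace_Gamma u + u = f on the unit circle.  In
# the arc-length parameter theta this is the periodic 1D problem
# -u'' + u = f.  With exact solution u = cos(3 theta) one gets
# f = 10 cos(3 theta); a periodic second-order finite difference
# discretization recovers it to O(h^2).
N = 256
theta = 2 * np.pi * np.arange(N) / N
h = 2 * np.pi / N

D2 = -2 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
D2[0, -1] = D2[-1, 0] = 1.0          # periodic wrap-around
D2 /= h**2

A = -D2 + np.eye(N)
f = 10 * np.cos(3 * theta)
u = np.linalg.solve(A, f)

print("max error vs exact cos(3 theta):",
      np.abs(u - np.cos(3 * theta)).max())
```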
"2433859","Conference: 1st SIAM Northern and Central California Sectional Conference","DMS","APPLIED MATHEMATICS, COMPUTATIONAL MATHEMATICS","09/01/2024","07/25/2024","Noemi Petra","CA","University of California - Merced","Standard Grant","Hailiang Liu","08/31/2025","$40,000.00","Changho Kim, Erica Rutter, Boaz Ilan, Roummel Marcia","npetra@ucmerced.edu","5200 N LAKE RD","MERCED","CA","953435001","2092012039","MPS","126600, 127100","075Z, 079Z, 7556, 9263","$0.00","The Society for Industrial and Applied Mathematics (SIAM) recently recognized the establishment of the Northern and Central California (SIAM-NCC) Section, whose primary goal is to provide an ongoing opportunity for mathematicians working in the sectors of academia, national laboratory, industry, and government to come together and form a strong social and professional network. The first SIAM-NCC conference scheduled to be held at the University of California, Merced campus during October 9-11, 2024 has the following aims: (1) create an opportunity for scientific researchers in the central and northern California regions to meet, network, and share the innovations and recent developments in their fields; (2) attract and energize a diverse group of students and researchers particularly those from underrepresented minority groups; (3) offer opportunities to SIAM members from various institutions in the region to present their work, who for various reasons often struggle to participate at national and international SIAM meetings; and (4) provide early career researchers to connect with others who are at similar career stages. The broader goal of this conference is to bring together a diverse group of students and researchers particularly those from underrepresented minority groups and create opportunities for sharing ideas and networking. The central and northern California regions provide rich opportunities for involving students from underrepresented and financially challenged populations majoring in science, technology, engineering, and mathematics (STEM) fields.
The 2024 SIAM-NCC Conference is centered around the following five research themes of applied mathematics: (1) mathematical and numerical analysis; (2) optimization, inverse problems, and optimal experimental design; (3) scientific and high-performance computing; (4) uncertainty quantification and prediction; and (5) scientific machine learning (ML), artificial intelligence (AI), and digital twins. The conference will feature four plenary speakers from industry, academia, and national laboratories. Ten mini-symposia are planned to capture the conference themes in critical areas of research in applied mathematics. Four panels will cover a variety of topics aimed at undergraduate and graduate students, early career researchers, and the greater scientific community. In particular, topics include (1) career opportunities for undergraduate students, (2) transitioning from student to researcher (e.g., preparing for internships and postdoc positions), (3) industry and laboratory careers, and (4) the role of AI/ML in science and technology. Finally, to facilitate a more open and informal discussion about research and career opportunities, to accommodate broader research themes, and to offer all attendees the opportunity to present their work, two poster sessions are also scheduled. Undergraduate and graduate students, as well as postdoctoral scholars and other early career researchers, will be particularly encouraged to participate in these sessions. The conference website is: https://sites.google.com/view/siam-ncc/siam-ncc-conference-2024.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2424305","Comparative Study of Finite Element and Neural Network Discretizations for Partial Differential Equations","DMS","COMPUTATIONAL MATHEMATICS","03/15/2024","03/15/2024","Jonathan Siegel","TX","Texas A&M University","Continuing Grant","Yuliya Gorb","07/31/2025","$140,889.00","","jwsiegel@tamu.edu","400 HARVEY MITCHELL PKY S STE 30","COLLEGE STATION","TX","778454375","9798626777","MPS","127100","079Z, 9263","$0.00","This research connects two different fields, machine learning from data science and numerical partial differential equations from scientific and engineering computing, through the comparative study of the finite element method and finite neuron method. Finite element methods have undergone decades of study by mathematicians, scientists and engineers in many fields and there is a rich mathematical theory concerning them. They are widely used in scientific computing and modelling to generate accurate simulations of a wide variety of physical processes, most notably the deformation of materials and fluid mechanics. By contrast, deep neural networks are relatively new and have only been widely used in the last decade. In this short time, they have demonstrated remarkable empirical performance on a wide variety of machine learning tasks, most notably in computer vision and natural language processing. Despite this great empirical success, there is still a very limited mathematical understanding of why and how deep neural networks work so well. We hope to leverage the success of deep learning to improve numerical methods for partial differential equations and to leverage the theoretical understanding of the finite element method to better understand deep learning. The interdisciplinary nature of the research will also provide a good training experience for junior researchers. This project will support 1 graduate student each year of the three year project.
Piecewise polynomials represent one of the most important functional classes in approximation theory. In classical approximation theory and numerical methods for partial differential equations, these functional classes are often represented by linear functional spaces associated with a priori given grids, for example, by splines and finite element spaces. In deep learning, function classes are typically represented by a composition of a sequence of linear functions and coordinate-wise non-linearities. One important non-linearity is the rectified linear unit (ReLU) function and its powers (ReLUk). The resulting functional class, ReLUk-DNN, does not form a linear vector space but is rather parameterized non-linearly by a high-dimensional set of parameters. This function class can be used to solve partial differential equations and we call the resulting numerical algorithms the finite neuron method (FNM). Proposed research topics include: error estimates for the finite neuron method, universal construction of conforming finite elements for arbitrarily high order partial differential equations, an investigation into how and why the finite neuron method gives a much better asymptotic error estimate than the corresponding finite element method, and the development and analysis of efficient algorithms for using the finite neuron method.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410671","Robust Algorithms Based on Domain Decomposition and Microlocal-Analysis for Wave propagation","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","06/14/2024","Yassine Boubendir","NJ","New Jersey Institute of Technology","Standard Grant","Ludmil T. Zikatanov","06/30/2027","$200,000.00","","boubendi@njit.edu","323 DR MARTIN LUTHER KING JR BLV","NEWARK","NJ","071021824","9735965275","MPS","127100","9263","$0.00","More than ever, technological advances in industries such as aerospace, microchips, telecommunications, and renewable energy rely on advanced numerical solvers for wave propagation. The aim of this project is the development of efficient and accurate algorithms for acoustic and electromagnetic wave propagation in complex domains containing, for example, inlets, cavities, or a multilayer structure. These geometrical features continue to pose challenges for numerical computation. The numerical methods developed in this project will have application to radar, communications, remote sensing, stealth technology, satellites, and many others. Fundamental theoretical and computational issues as well as realistic complex geometries such as those occurring in aircraft and submarines will be addressed in this project. The obtained algorithms will facilitate the use of powerful computers when simulating industrial high-frequency wave problems. The numerical solvers obtained through this research will be made readily available to scientists in aerospace and other industries, which will contribute to enhancing the U.S leadership in this field. Several aspects in this project will benefit the education of both undergraduate and graduate students. Graduate students will gain expertise in both scientific computing and mathematical analysis. This will reinforce their preparation to face future challenges in science and technology.
The aim of this project is the development of efficient and accurate algorithms for acoustic and electromagnetic wave propagation in complex domains. One of the main goals of this project resides in the design of robust algorithms based on high-frequency integral equations, microlocal and numerical analysis, asymptotic methods, and finite element techniques. The investigator plans to derive rigorous asymptotic expansions for incidences more general than plane waves in order to support the high-frequency integral equation multiple scattering iterative procedure. The investigator will introduce Ray-stabilized Galerkin boundary element methods, based on a new theoretical development on ray tracing, to significantly reduce the computational cost at each iteration and limit the exponentially increasing cost of multiple scattering iterations to a fixed number. Using the theoretical findings in conjunction with the stationary phase lemma, frequency-independent quadratures for approximating the multiple scattering amplitude will also be designed. These new methods will be beneficial for industrial applications involving multi-component radar and antenna design. In addition, this project includes development of new non-overlapping domain decomposition methods with considerably enhanced convergence characteristics. The main idea resides in a novel treatment of the continuity conditions in the neighborhood of the so called cross-points. Analysis of the convergence and stability will be included in parallel to numerical simulations in the two and three dimensional cases using high performance computing.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
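To make the non-overlapping domain decomposition idea above concrete, here is a minimal sketch, assuming a 1D Poisson model problem and first-order one-sided interface derivatives of our choosing; it shows Lions' classical Schwarz iteration with Robin transmission conditions, a generic textbook construction and not the algorithms proposed in this award.

```python
# Minimal sketch: Lions' non-overlapping Schwarz iteration with Robin
# transmission conditions for -u'' = 1 on (0,1), u(0) = u(1) = 0, split at
# x = 0.5 (exact solution u = x(1-x)/2, so u(0.5) = 0.125). One-sided
# first-order interface derivatives keep the sketch short; the iteration
# converges geometrically to the coupled discrete solution.
import numpy as np

m = 50                       # grid intervals per subdomain
h = 0.5 / m
p = 1.0                      # Robin parameter, p > 0

def solve_left(g):
    # unknowns u_1..u_m; u_0 = 0; Robin at x = 0.5: u' + p*u = g
    A = np.zeros((m, m)); b = np.ones(m)
    for i in range(m - 1):
        A[i, i] = 2 / h**2
        if i > 0:
            A[i, i - 1] = -1 / h**2
        A[i, i + 1] = -1 / h**2
    A[m - 1, m - 2] = -1 / h
    A[m - 1, m - 1] = 1 / h + p
    b[m - 1] = g
    return np.linalg.solve(A, b)

def solve_right(g):
    # unknowns v_0..v_{m-1}; v_m = 0; Robin at x = 0.5: -v' + p*v = g
    A = np.zeros((m, m)); b = np.ones(m)
    A[0, 0] = 1 / h + p
    A[0, 1] = -1 / h
    b[0] = g
    for i in range(1, m):
        A[i, i] = 2 / h**2
        A[i, i - 1] = -1 / h**2
        if i < m - 1:
            A[i, i + 1] = -1 / h**2
    return np.linalg.solve(A, b)

g1 = g2 = 0.0
for it in range(25):
    u, v = solve_left(g1), solve_right(g2)   # independent subdomain solves
    du = (u[-1] - u[-2]) / h                 # one-sided u'(0.5)
    dv = (v[1] - v[0]) / h                   # one-sided v'(0.5)
    g1, g2 = dv + p * v[0], -du + p * u[-1]  # exchange Robin data
    print(it, u[-1])                         # interface value -> about 0.125
```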
"2409903","Development of novel numerical methods for forward and inverse problems in mean field games","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","06/11/2024","Yat Tin Chow","CA","University of California-Riverside","Continuing Grant","Troy D. Butler","06/30/2027","$95,280.00","","yattinc@ucr.edu","200 UNIVERSTY OFC BUILDING","RIVERSIDE","CA","925210001","9518275535","MPS","127100","9263","$0.00","Mean field games is the study of strategic decision making in large populations where individual players interact through a certain quantity in the mean field. Mean field games have strong descriptive power in socioeconomics and biology, e.g. in the understanding of social cooperation, stock markets, trading and economics, biological systems, election dynamics, population games, robotic control, machine learning, dynamics of multiple populations, pandemic modeling and control as well as vaccination distribution. It is therefore essential to develop accurate numerical methods for large-scale mean field games and their model recovery. However, current computational approaches for the recovery problem are impractical in high dimensions. This project will comprehensively study new computational methods for both large-scale mean field games and their model recovery. The comprehensive plans will cover algorithmic development, theoretical analysis, numerical implementation and practical applications. The project will also involve research on speeding up the forward and inverse problem computations to speed up the computation for mean field game modeling and turn real life mean field game model recovery problems from computationally unaffordable to affordable. The research team will disseminate results through publications, professional presentations, the training of graduate students at the University of California, Riverside as well as through public outreach events that involve public talks and engagement with high school math fairs. The goals of these outreach events are to increase public literacy and public engagement in mathematics, improve STEM education and educator development, and broaden participation of women and underrepresented minorities.
The project will provide novel computational methods for both forward and inverse problems of mean field games. The team will (1) develop two new numerical methods for forward problems in mean field games, namely monotone inclusion with the Benamou-Brenier formulation and an extragradient algorithm with moving anchoring; (2) develop three new numerical methods for inverse problems in mean field games with only boundary measurements, namely a three-operator splitting scheme, a semi-smooth Newton acceleration method, and a direct sampling method. Both theoretical analysis and practical implementations will be emphasized. In particular, numerical methods for inverse problems in mean field games, a main target of the project, will be designed to work with only boundary measurements. This represents a brand-new direction in inverse problems and optimization. The project will also seek the simultaneous reconstruction of coefficients in the severely ill-posed case when only noisy boundary measurements from one or two measurement events are available.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
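Among the forward solvers named above is an extragradient algorithm. The sketch below shows the plain extragradient mechanism (without the moving-anchoring variant) on a toy bilinear saddle-point problem of our choosing, not an MFG system: plain gradient descent-ascent spirals outward on this problem, while the predictor half-step restores convergence.

```python
# Extragradient on the toy saddle-point problem min_x max_y x*y, whose unique
# saddle point is the origin. Plain gradient descent-ascent diverges here;
# evaluating the gradient at a predictor half-step restores convergence.
import numpy as np

tau = 0.5
x, y = 1.0, 1.0
for k in range(101):
    xh, yh = x - tau * y, y + tau * x    # predictor step at the current point
    x, y = x - tau * yh, y + tau * xh    # corrector uses the predictor gradient
    if k % 20 == 0:
        print(k, np.hypot(x, y))         # distance to the saddle point shrinks
```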
+"2424305","Comparative Study of Finite Element and Neural Network Discretizations for Partial Differential Equations","DMS","COMPUTATIONAL MATHEMATICS","03/15/2024","03/15/2024","Jonathan Siegel","TX","Texas A&M University","Continuing Grant","Yuliya Gorb","07/31/2025","$140,889.00","","jwsiegel@tamu.edu","400 HARVEY MITCHELL PKY S STE 30","COLLEGE STATION","TX","778454375","9798626777","MPS","127100","079Z, 9263","$0.00","This research connects two different fields, machine learning from data science and numerical partial differential equations from scientific and engineering computing, through the comparative study of the finite element method and finite neuron method. Finite element methods have undergone decades of study by mathematicians, scientists and engineers in many fields and there is a rich mathematical theory concerning them. They are widely used in scientific computing and modelling to generate accurate simulations of a wide variety of physical processes, most notably the deformation of materials and fluid mechanics. By contrast, deep neural networks are relatively new and have only been widely used in the last decade. In this short time, they have demonstrated remarkable empirical performance on a wide variety of machine learning tasks, most notably in computer vision and natural language processing. Despite this great empirical success, there is still a very limited mathematical understanding of why and how deep neural networks work so well. We hope to leverage the success of deep learning to improve numerical methods for partial differential equations and to leverage the theoretical understanding of the finite element method to better understand deep learning. The interdisciplinary nature of the research will also provide a good training experience for junior researchers. This project will support 1 graduate student each year of the three year project.
Piecewise polynomials represent one of the most important functional classes in approximation theory. In classical approximation theory and numerical methods for partial differential equations, these functional classes are often represented by linear functional spaces associated with a priori given grids, for example, by splines and finite element spaces. In deep learning, function classes are typically represented by a composition of a sequence of linear functions and coordinate-wise non-linearities. One important non-linearity is the rectified linear unit (ReLU) function and its powers (ReLUk). The resulting functional class, ReLUk-DNN, does not form a linear vector space but is rather parameterized non-linearly by a high-dimensional set of parameters. This function class can be used to solve partial differential equations and we call the resulting numerical algorithms the finite neuron method (FNM). Proposed research topics include: error estimates for the finite neuron method, universal construction of conforming finite elements for arbitrarily high order partial differential equations, an investigation into how and why the finite neuron method gives a much better asymptotic error estimate than the corresponding finite element method, and the development and analysis of efficient algorithms for using the finite neuron method.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
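As a rough illustration of the finite neuron method flavor described above, the following sketch uses ReLU^2 features to collocate a two-point boundary value problem; the simplifications (fixed random inner weights, so the fit reduces to linear least squares, and a penalty for the boundary conditions) are ours, not the project's algorithm.

```python
# Shallow ReLU^2 network with fixed random inner weights; outer coefficients
# fitted by least-squares collocation of -u'' = f on (0,1) with penalized
# boundary conditions. Exact solution: u(x) = sin(pi x) for f = pi^2 sin(pi x).
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 60                        # collocation points, neurons
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)

w = rng.uniform(-4, 4, m)             # fixed random inner weights
b = rng.uniform(-4, 4, m)             # fixed random biases

def feats(t):                         # ReLU^2 features max(w t + b, 0)^2
    return np.maximum(np.outer(t, w) + b, 0.0) ** 2

def feats_dd(t):                      # their second derivative in t
    return 2.0 * w**2 * (np.outer(t, w) + b > 0.0)

pen = 1e3                             # boundary penalty weight
A = np.vstack([-feats_dd(x), pen * feats(np.array([0.0, 1.0]))])
rhs = np.concatenate([f, [0.0, 0.0]])
c, *_ = np.linalg.lstsq(A, rhs, rcond=None)
print("max error:", np.abs(feats(x) @ c - np.sin(np.pi * x)).max())
```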
"2403506","Conference: Power of Diversity in Uncertainty Quantification (PoD UQ)","DMS","COMPUTATIONAL MATHEMATICS","02/01/2024","01/30/2024","Annalisa Quaini","TX","University of Houston","Standard Grant","Troy D. Butler","01/31/2025","$23,500.00","","quaini@math.uh.edu","4300 MARTIN LUTHER KING BLVD","HOUSTON","TX","772043067","7137435773","MPS","127100","7556, 9263","$0.00","The Power of Diversity in Uncertainty Quantification (PoD UQ) workshop will be a one-day meeting hosted by the International School for Advanced Studies in Trieste, Italy, on February 26, 2024. This is the day prior to the beginning of the next SIAM Conference on Uncertainty Quantification (SIAM UQ24), which will be held in Trieste from February 27 to March 1, 2024. SIAM UQ24 is dedicated to recognizing the natural synergy between computational models and statistical methods, strengthened by the emergence of machine learning as a practical tool. Thus, for the first time there will be a large international meeting on the whole UQ ecosystem viewed through a unifying lens. For graduate students and young researchers entering the field of UQ, SIAM UQ24 offers a unique opportunity to learn from and exchange ideas with diverse groups of UQ professionals from academia, industry, and government laboratories. As attractive as this opportunity is, the size and breadth of the conference could be daunting. PoD UQ is targeted to graduate students and young researchers to ease their approach to SIAM UQ24 and ensure they make the most out of it.
The name of the event highlights the central role played by diversity in UQ, a field whose advancement requires the integration of mathematics and statistics, theory and computations. Diversity also refers to the under-represented communities that the event targets for greater inclusivity. The goals of PoD UQ are to: (i) Introduce students and early-career researchers to the state-of-the-art and current trends in modeling, sampling, and analyzing uncertainties. The meeting will feature three talks meant to give an overview of the different areas in UQ and two introductory talks on current hot topics. All the talks will be delivered by internationally renowned leaders in the field. In addition, a panel of established UQ researchers will discuss the opportunities and challenges of working in an application-driven and inherently interdisciplinary field that relies on a broad range of mathematical and statistical foundations, domain knowledge, and algorithmic and computational tools. (ii) Offer an excellent chance for networking with both peers and more established researchers. Recognizing the importance of a supportive network in building a career, especially for people from minority groups, the schedule of PoD UQ includes ample time to connect and interact. The participants will have two coffee breaks and a generous lunch break to interact among themselves. A poster session and a seated dinner with assigned seats will ensure that the participants get to interact with the speakers and the panelists. More details can be found at: http://go.sissa.it/siamuq24satelliteevent
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2409807","Approximating partial differential equations without boundary conditions","DMS","COMPUTATIONAL MATHEMATICS","10/01/2024","06/03/2024","Andrea Bonito","TX","Texas A&M University","Standard Grant","Yuliya Gorb","09/30/2027","$399,583.00","","bonito@math.tamu.edu","400 HARVEY MITCHELL PKY S STE 30","COLLEGE STATION","TX","778454375","9798626777","MPS","127100","9263","$0.00","The predicting power of computational tools is of paramount importance in engineering and science. They offer insights into the behavior of complex systems, modeled by partial differential equations inside a region of interest. Boundary conditions expressing the influence of the surroundings must be provided to complete the mathematical models. However, there are many instances for which the boundary conditions are not available to practitioners: the understanding of the physical processes might be lacking, for instance when modeling the airflow around an airplane, or the boundary data is not accessible. This project aims to design numerical algorithms able to alleviate missing information on boundary conditions by incorporating physical measurements of the quantity of interest. The problems to be addressed fit under the strategic area of machine learning, and the potential scientific impact of this research is far-reaching. It includes improved meteorological forecasting, discovering biological pathways, and commercial design.
In traditional numerical treatments of elliptic partial differential equations, the solution to be approximated is completely characterized by the given data. However, there are many instances for which the boundary conditions are not available. While not sufficient to pinpoint the solution, measurements of the solution are provided to compensate for the incomplete information. The aim of this research program is to exploit the structure provided by the PDE to design and analyze practical numerical algorithms able to construct the best simultaneous approximation of all functions satisfying the PDE and the measurements. This project embeds the design, analysis, and implementation of numerical methods for PDEs within an optimal recovery framework. It uncovers uncharted problems requiring new mathematical tools.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
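A toy version of the theme above, with a harmonic model problem of our own choosing (not the award's optimal recovery algorithms): recover a function satisfying Laplace's equation in the unit disk from scattered interior measurements, with no boundary condition at all, by least squares over a basis that satisfies the PDE exactly.

```python
# Recover a harmonic function on the unit disk from scattered interior samples
# (no boundary condition) by least squares over harmonic polynomials
# r^k cos(k t), r^k sin(k t); every basis element solves Laplace's equation,
# so the PDE structure substitutes for the missing boundary data.
import numpy as np

rng = np.random.default_rng(1)
M, K = 80, 8                                   # samples, max harmonic degree
r = 0.7 * np.sqrt(rng.uniform(0, 1, M))        # samples inside radius 0.7
t = rng.uniform(0, 2 * np.pi, M)
xs, ys = r * np.cos(t), r * np.sin(t)

def basis(x, y):
    rr, th = np.hypot(x, y), np.arctan2(y, x)
    cols = [np.ones_like(rr)]
    for k in range(1, K + 1):
        cols += [rr**k * np.cos(k * th), rr**k * np.sin(k * th)]
    return np.column_stack(cols)

u = lambda x, y: np.exp(x) * np.cos(y)         # a harmonic function
data = u(xs, ys) + 1e-6 * rng.standard_normal(M)
c, *_ = np.linalg.lstsq(basis(xs, ys), data, rcond=None)

# evaluate the recovered function outside the sampled region
xe, ye = np.array([0.9]), np.array([0.1])
print((basis(xe, ye) @ c)[0], u(0.9, 0.1))     # the two values nearly agree
```

Stability under larger noise and how best to exploit the PDE structure is exactly where the analysis becomes delicate, which is the point of the award above.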
"2409868","On Iteratively Regularized Alternating Minimization under Nonlinear Dynamics Constraints with Applications to Epidemiology","DMS","COMPUTATIONAL MATHEMATICS","09/01/2024","05/29/2024","Alexandra Smirnova","GA","Georgia State University Research Foundation, Inc.","Standard Grant","Troy D. Butler","08/31/2027","$200,000.00","Xiaojing Ye","asmirnova@gsu.edu","58 EDGEWOOD AVE NE","ATLANTA","GA","303032921","4044133570","MPS","127100","9263","$0.00","How widely has the virus spread? This important and often overlooked question was brought to light by the recent COVID-19 outbreak. Several techniques have been used to account for silent spreaders along with varying testing and healthcare seeking habits as the main reasons for under-reporting of incidence cases. It has been observed that silent spreaders play a more significant role in disease progression than previously understood, highlighting the need for policymakers to incorporate these hidden figures into their strategic responses. Unlike other disease parameters, i.e., incubation and recovery rates, the case reporting rate and the time-dependent effective reproduction number are directly influenced by a large number of factors making it impossible to directly quantify these parameters in any meaningful way. This project will advance iteratively regularized numerical algorithms, which have emerged as a powerful tool for reliable estimation (from noise-contaminated data) of infectious disease parameters that are crucial for future projections, prevention, and control. Apart from epidemiology, the project will benefit all real-world applications involving massive amounts of observation data for multiple stages of the inversion process with a shared model parameter. In the course of their theoretical and numerical studies, the PIs will continue to create research opportunities for undergraduate and graduate students, including women and students from groups traditionally underrepresented in STEM disciplines. A number of project topics are particularly suitable for student research and will be used to train some of the next generation of computational mathematicians.
In the framework of this project, the PIs will develop new regularized alternating minimization algorithms for solving ill-posed parameter-estimation problems constrained by nonlinear dynamics. While significant computational challenges are shared by both deterministic trust-region and Bayesian methods (such as the need to solve possibly complex ODE or PDE systems at every step of the iterative process), the team will address these challenges by constructing a family of fast and stable iteratively regularized optimization algorithms, which carefully alternate between updating model parameters and state variables.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
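A schematic of the alternating pattern described above, on a toy logistic model entirely of our own choosing (the PIs' epidemic systems and algorithms are more involved): alternate a state update with the parameter frozen and a closed-form Tikhonov-regularized parameter update, while the regularization weight decreases across outer iterations.

```python
# Regularized alternating minimization on a toy ODE-constrained estimation
# problem: logistic dynamics x' = theta*x*(1-x), noisy state observations,
# alternating state and parameter updates while alpha_k decreases
# (iterative regularization). True parameter: theta = 1.5.
import numpy as np
from scipy.optimize import least_squares

h, N, theta_true = 0.1, 60, 1.5
x = np.empty(N + 1); x[0] = 0.05
for n in range(N):                          # forward Euler logistic dynamics
    x[n + 1] = x[n] + h * theta_true * x[n] * (1 - x[n])
rng = np.random.default_rng(2)
data = x + 0.01 * rng.standard_normal(N + 1)

mu, theta0 = 50.0, 0.5                      # model-fit weight, prior guess
xk, theta = data.copy(), theta0
for k in range(15):
    alpha = 0.5**k                          # decreasing regularization weight
    def res(z):                             # data misfit + dynamics residual
        r = z[1:] - z[:-1] - h * theta * z[:-1] * (1 - z[:-1])
        return np.concatenate([z - data, np.sqrt(mu) * r])
    xk = least_squares(res, xk).x           # state update, theta frozen
    g = h * xk[:-1] * (1 - xk[:-1])         # dynamics is linear in theta
    d = xk[1:] - xk[:-1]
    theta = (mu * g @ d + alpha * theta0) / (mu * g @ g + alpha)
    print(k, theta)                         # climbs toward theta_true = 1.5
```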
@@ -24,11 +25,11 @@
"2411264","NSF-BSF: Scalable Graph Neural Network Algorithms and Applications to PDEs","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","06/21/2024","Lars Ruthotto","GA","Emory University","Continuing Grant","Troy D. Butler","07/31/2027","$121,190.00","","lruthotto@emory.edu","201 DOWMAN DR NE","ATLANTA","GA","303221061","4047272503","MPS","127100","079Z, 9263","$0.00","This project will advance the fields of geometric machine learning and numerical partial differential equations and strengthen the connections between them. Geometric machine learning provides an effective approach for analyzing unstructured data and has become indispensable for computer graphics and vision, bioinformatics, social network analysis, protein folding, and many other areas. Partial differential equations (PDEs) are ubiquitous in mathematical modeling, and their numerical solution enables the simulation of real-world phenomena in engineering design, medical analysis, and material sciences, to name a few. A unified study of both fields exposes many potential synergies, which the project will seize to improve the efficiency of algorithms in both areas. The first goal is to improve the scalability of geometric machine learning approaches based on graph neural networks (GNNs) to accommodate growing datasets with millions of nodes using insights and ideas from numerical PDEs. The second goal is to accelerate numerical PDE simulations by enhancing numerical solvers on unstructured meshes with GNN components. Through these improvements in computational efficiency, the project will enable more accurate data analysis and PDE simulations for high-impact applications across the sciences, engineering, and industry. Graduate students and postdoctoral researchers will be integrated into this research as part of their professional training.
This project will develop computational algorithms that improve the efficiency and scalability of GNNs and create new approaches for GNNs for solving nonlinear PDEs on unstructured meshes. To improve the scalability of GNNs to graphs with millions of nodes, the research team will develop spatial smoothing operators, coarsening operators, and multilevel training schemes. To accelerate PDE simulations on unstructured meshes, the team will train GNNs to produce effective prolongation, restriction, and coarse mesh operators in multigrid methods and preconditioners in Krylov methods. The team will demonstrate that the resulting hybrid schemes accelerate computations and are provably convergent. To show the broad applicability of the schemes, the team will consider challenging PDE problems in computational fluid dynamics and test the scalable GNNs on established geometric learning benchmark tasks such as shape and node classification. The mathematical backbone of these developments is algebraic multigrid techniques, which motivate GNN design and training and are used in the PDE solvers.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
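For context, the classical two-grid cycle into which the proposed GNN components would be inserted looks as follows in its simplest 1D form; the interpolation P, restriction R, and coarse operator here are hand-coded, which is precisely what the project proposes to learn on unstructured meshes.

```python
# Baseline two-grid cycle for the 1D Poisson matrix, with hand-coded linear
# interpolation P, restriction R = P^T/2, and Galerkin coarse operator.
import numpy as np

n = 63                                           # fine interior points (odd)
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
nc = (n - 1) // 2
P = np.zeros((n, nc))
for j in range(nc):
    i = 2 * j + 1                                # coarse node j at fine node i
    P[i, j] = 1.0
    P[i - 1, j] = P[i + 1, j] = 0.5
R = P.T / 2.0                                    # full-weighting restriction
Ac = R @ A @ P                                   # Galerkin coarse operator

def two_grid(u, b, nu=2, omega=2/3):
    for _ in range(nu):                          # pre-smooth (weighted Jacobi)
        u = u + omega * (b - A @ u) / 2.0
    u = u + P @ np.linalg.solve(Ac, R @ (b - A @ u))   # coarse correction
    for _ in range(nu):                          # post-smooth
        u = u + omega * (b - A @ u) / 2.0
    return u

u, b = np.zeros(n), np.ones(n)
for k in range(8):
    u = two_grid(u, b)
    print(k, np.linalg.norm(b - A @ u))  # contracts by a roughly fixed factor
```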
"2414705","CAREER: Mathematical Modeling from Data to Insights and Beyond","DMS","COMPUTATIONAL MATHEMATICS","01/15/2024","01/22/2024","Yifei Lou","NC","University of North Carolina at Chapel Hill","Continuing Grant","Yuliya Gorb","05/31/2025","$141,540.00","","yflou@unc.edu","104 AIRPORT DR STE 2200","CHAPEL HILL","NC","275995023","9199663411","MPS","127100","1045, 9263","$0.00","This project will develop both analytical and computational tools for data-driven applications. In particular, analytical tools will hold great promise to provide theoretical guidance on how to acquire data more efficiently than current practices. To retrieve useful information from data, numerical methods will be investigated with emphasis on guaranteed convergence and algorithmic acceleration. Thanks to close interactions with collaborators in data science and information technology, the investigator will ensure the practicability of the proposed research, leading to a real impact. The investigator will also devote herself to various outreach activities in the field of data science. For example, she will initiate a local network of students, faculty members, and domain experts to develop close ties between mathematics and industry as well as to broaden career opportunities for mathematics students. This initiative will have a positive impact on the entire mathematical sciences community. In addition, she will advocate for the integration of mathematical modeling into K-16 education by collaborating with The University of Texas at Dallas Diversity Scholarship Program to reach out to mathematics/sciences teachers.
This project addresses important issues in extracting insights from data and training the next generation in the ""big data"" era. The research focuses on signal/image recovery from a limited number of measurements, in which ""limited"" refers to the fact that the amount of data that can be taken or transmitted is limited by technical or economic constraints. When data is insufficient, one often requires additional information from the application domain to build a mathematical model, followed by numerical methods. Questions to be explored in this project include: (1) how difficult is the process of extracting insights from data? (2) how should reasonable assumptions be taken into account to build a mathematical model? (3) how should an efficient algorithm be designed to find a model solution? More importantly, a feedback loop from insights to data will be introduced, i.e., (4) how to improve upon data acquisition so that information becomes easier to retrieve? As these questions mimic the standard procedure in mathematical modeling, the proposed research provides a plethora of illustrative examples to enrich the education of mathematical modeling. In fact, one of this CAREER award's educational objectives is to advocate the integration of mathematical modeling into K-16 education so that students will develop problem-solving skills at an early age. In addition, the proposed research requires close interactions with domain experts in business, industry, and government (BIG), where real-world problems come from. This requirement helps to fulfill another educational objective, that is, to promote BIG employment by providing adequate training for students in successful approaches to BIG problems together with BIG workforce skills.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
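A generic example of recovery from a limited number of measurements, in the standard compressed sensing setting (our illustration; the award's models and acquisition questions are broader): a sparse signal is recovered from far fewer random measurements than unknowns by l1-regularized least squares, solved with the iterative soft-thresholding algorithm (ISTA).

```python
# Sparse recovery from m << n random measurements y = A x via ISTA.
import numpy as np

rng = np.random.default_rng(3)
n, m, s = 200, 60, 8
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

lam = 0.01
L = np.linalg.norm(A, 2) ** 2               # gradient Lipschitz constant
x = np.zeros(n)
for k in range(500):
    z = x - A.T @ (A @ x - y) / L           # gradient step on the data term
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)   # soft threshold
print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # small rel. error
```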
"2411396","Interacting particle system for nonconvex optimization","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","06/13/2024","Yuhua Zhu","CA","University of California-San Diego","Continuing Grant","Troy D. Butler","06/30/2027","$80,048.00","","yuz244@ucsd.edu","9500 GILMAN DR","LA JOLLA","CA","920930021","8585344896","MPS","127100","9263","$0.00","Collective Intelligence offers profound insights into how groups, whether they be cells, animals, or even machines, can work together to accomplish tasks more effectively than individuals alone. Originating in biology and now influencing fields as varied as management science, artificial intelligence, and robotics, this concept underscores the potential of collaborative efforts in solving complex challenges. On the other hand, the quest for finding global minimizers of nonconvex optimization problems arises in physics and chemistry, as well as in machine learning due to the widespread adoption of deep learning. Building the bridge between these two seemingly disparate realms, this project will utilize Collective Intelligence to leverage the interacting particle systems as a means to address the formidable challenge of finding global minimizers in nonconvex optimization problems. Graduate students will also be integrated within the research team as part of their professional training.
This project will focus on a gradient-free optimization method inspired by a consensus-based interacting particle system to solve different types of nonconvex optimization problems. Effective communication and cooperation among particles within the system play pivotal roles in efficiently exploring the landscape and converging to the global minimizer. Aim 1 targets nonconvex optimization with equality constraints; Aim 2 addresses nonconvex optimization on convex sets; and Aim 3 applies to Clustered Federated Learning. Additionally, convergence guarantees will be provided for nonconvex and nonsmooth objective functions. Theoretical analyses, alongside practical implementations, will provide valuable insights and tools for addressing different types of nonconvex optimization challenges.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
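The consensus-based particle method referenced above has a compact generic form; the sketch below (parameter choices are ours) shows the basic unconstrained algorithm: particles drift toward a Gibbs-weighted consensus point and fluctuate, locating a global minimizer without any gradient information.

```python
# Minimal consensus-based optimization (CBO) on the nonconvex Rastrigin
# function, whose global minimizer is the origin.
import numpy as np

def rastrigin(x):                          # x has shape (particles, dim)
    return 10 * x.shape[1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=1)

rng = np.random.default_rng(4)
N, d = 200, 2
X = rng.uniform(-4, 4, (N, d))             # particle positions
lam, sigma, beta, dt = 1.0, 0.7, 30.0, 0.05

for k in range(400):
    fx = rastrigin(X)
    w = np.exp(-beta * (fx - fx.min()))    # Gibbs weights favor good particles
    xbar = (w[:, None] * X).sum(axis=0) / w.sum()   # weighted consensus point
    diff = X - xbar
    X = X - lam * dt * diff + sigma * np.sqrt(dt) * diff * rng.standard_normal((N, d))

print("consensus point:", xbar)            # typically lands near the origin
```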
-"2416250","Theory and algorithms for a new class of computationally amenable nonconvex functions","DMS","COMPUTATIONAL MATHEMATICS","03/01/2024","03/12/2024","Ying Cui","CA","University of California-Berkeley","Standard Grant","Jodi Mead","06/30/2026","$240,330.00","","yingcui@berkeley.edu","1608 4TH ST STE 201","BERKELEY","CA","947101749","5106433891","MPS","127100","079Z, 9263","$0.00","As the significance of data science continues to expand, nonconvex optimization models become increasingly prevalent in various scientific and engineering applications. Despite the field's rapid development, there are still a host of theoretical and applied problems that so far are left open and void of rigorous analysis and efficient methods for solution. Driven by practicality and reinforced by rigor, this project aims to conduct a comprehensive investigation of composite nonconvex optimization problems and games. The technologies developed will offer valuable tools for fundamental science and engineering research, positively impacting the environment and fostering societal integration with the big-data world. Additionally, the project will educate undergraduate and graduate students, cultivating the next generation of experts in the field.
This project seeks to advance state-of-the-art techniques for solving nonconvex optimization problems and games through both theoretical and computational approaches. At its core is the innovative concept of ""approachable difference-of-convex functions,"" which uncovers a hidden, asymptotically decomposable structure within the multi-composition of nonconvex and non-smooth functions. The project will tackle three main tasks: (i) establishing fundamental properties for a novel class of computationally amenable nonconvex and non-smooth composite functions; (ii) designing and analyzing computational schemes for single-agent optimization problems, with objective and constrained functions belonging to the aforementioned class; and (iii) extending these approaches to address nonconvex games.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410678","Collaborative Research: Data-driven Realization of State-space Dynamical Systems via Low-complexity Algorithms","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","06/07/2024","Aaron Welters","FL","Florida Institute of Technology","Standard Grant","Jodi Mead","07/31/2027","$125,000.00","Xianqi Li","awelters@fit.edu","150 W UNIVERSITY BLVD","MELBOURNE","FL","329018995","3216748000","MPS","127100","079Z, 9263","$0.00","Data science is evolving rapidly and places a new perspective on realizing state-space dynamical systems. Predicting time-advanced states of dynamical systems is a challenging problem in STEM disciplines due to their nonlinear and complex nature. This project will utilize data-driven methods and analyze state-space dynamical systems to predict and understand future states, surpassing classical techniques. In addition, the PI team will (i) guide students to obtain cross-discipline PhD/Master's degrees, (ii) guide students to work in a peer-learning environment, and (iii) educate a diverse group of undergraduates.
In more detail, this project will utilize state-of-the-art machine learning (ML) algorithms to efficiently analyze and predict information within data matrices and tensor computations with low-complexity algorithms. Single-dimensional ML models are not efficient at extracting hidden semantic information in the time and space domains. As a result, it becomes challenging to simultaneously capture multi-dimensional spatiotemporal data in state-space dynamical systems. Using efficient ML algorithms to recover multi-dimensional spatiotemporal data simultaneously offers a breakthrough in understanding the chaotic behavior of dynamical systems. This project will (i) utilize ML to predict future states of dynamical systems based on high-dimensional data matrices captured at different time stamps, (ii) realize state-space controllable and observable systems via low-complexity algorithms to simultaneously analyze multiple states of the systems, (iii) analyze noise in state-space systems for uncertainty quantification, predict patterns in real-time states, generate counter-resonance states to suppress them, and optimize performance and stability, (iv) study system resilience via multiple state predictors and perturbations to assess performance and adaptation to disturbances and anomalies, and finally (v) optimize spacecraft trajectories, avoid impact, and use low-complexity algorithms to understand spacecraft launch dynamics on the Space Coast and support ERAU's mission in aeronautical research.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
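One standard data-driven realization technique in this spirit is dynamic mode decomposition (DMD), sketched below on synthetic snapshots; the example is our illustration of the general idea, not necessarily the PIs' algorithm. DMD fits a low-rank linear operator to snapshot pairs and uses it to predict future states.

```python
# Dynamic mode decomposition on synthetic high-dimensional snapshots driven
# by a hidden 2D rotation; DMD recovers the hidden eigenvalues and predicts
# the next state.
import numpy as np

rng = np.random.default_rng(5)
theta = 0.2
A_true = 0.99 * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
C = rng.standard_normal((40, 2))           # lift 2D dynamics to 40D snapshots
z, snaps = np.array([1.0, 0.0]), []
for k in range(60):
    snaps.append(C @ z)
    z = A_true @ z
X = np.array(snaps).T                      # 40 x 60 snapshot matrix

X1, X2 = X[:, :-1], X[:, 1:]
U, S, Vt = np.linalg.svd(X1, full_matrices=False)
r = 2                                      # truncation rank
U, S, Vt = U[:, :r], S[:r], Vt[:r]
Atilde = U.T @ X2 @ Vt.T / S               # reduced operator U^T A U
print(np.linalg.eigvals(Atilde))           # ~ 0.99 * exp(+-0.2i)

pred = U @ (Atilde @ (U.T @ X[:, -2]))     # one-step prediction
print(np.linalg.norm(pred - X[:, -1]))     # tiny prediction error
```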
+"2416250","Theory and algorithms for a new class of computationally amenable nonconvex functions","DMS","COMPUTATIONAL MATHEMATICS","03/01/2024","03/12/2024","Ying Cui","CA","University of California-Berkeley","Standard Grant","Jodi Mead","06/30/2026","$240,330.00","","yingcui@berkeley.edu","1608 4TH ST STE 201","BERKELEY","CA","947101749","5106433891","MPS","127100","079Z, 9263","$0.00","As the significance of data science continues to expand, nonconvex optimization models become increasingly prevalent in various scientific and engineering applications. Despite the field's rapid development, there are still a host of theoretical and applied problems that so far are left open and void of rigorous analysis and efficient methods for solution. Driven by practicality and reinforced by rigor, this project aims to conduct a comprehensive investigation of composite nonconvex optimization problems and games. The technologies developed will offer valuable tools for fundamental science and engineering research, positively impacting the environment and fostering societal integration with the big-data world. Additionally, the project will educate undergraduate and graduate students, cultivating the next generation of experts in the field.
This project seeks to advance state-of-the-art techniques for solving nonconvex optimization problems and games through both theoretical and computational approaches. At its core is the innovative concept of ""approachable difference-of-convex functions,"" which uncovers a hidden, asymptotically decomposable structure within the multi-composition of nonconvex and non-smooth functions. The project will tackle three main tasks: (i) establishing fundamental properties for a novel class of computationally amenable nonconvex and non-smooth composite functions; (ii) designing and analyzing computational schemes for single-agent optimization problems, with objective and constraint functions belonging to the aforementioned class; and (iii) extending these approaches to address nonconvex games.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
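For orientation, the classical difference-of-convex algorithm (DCA) underlying this line of work is shown below on a toy double-well objective of our own choosing: write f = g - h with g, h convex, linearize h at the current point, and solve the resulting convex surrogate exactly.

```python
# DCA on f(x) = x^4/4 - x^2/2 = g(x) - h(x), with g = x^4/4 and h = x^2/2
# (both convex). The surrogate step is
#   x_{k+1} = argmin_x g(x) - h'(x_k) * x  =>  x_{k+1} = cbrt(x_k),
# and DCA's descent property drives each start to a global minimizer +-1,
# where f = -1/4.
import numpy as np

f = lambda x: x**4 / 4 - x**2 / 2
x = np.array([2.0, -0.3, 0.05])            # several starts, componentwise
for k in range(25):
    x = np.cbrt(x)                         # exact solution of the surrogate
print(x, f(x))
```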
+"2411208","Collaborative Research: Numerical Methods and Differential Geometry","DMS","COMPUTATIONAL MATHEMATICS","06/15/2024","06/04/2024","Evan Gawlik","HI","University of Hawaii","Standard Grant","Yuliya Gorb","05/31/2027","$190,527.00","","egawlik@hawaii.edu","2425 CAMPUS RD SINCLAIR RM 1","HONOLULU","HI","968222247","8089567800","MPS","127100","9150, 9263","$0.00","Partial differential equations (PDEs) model a wide variety of phenomena, ranging from how an airplane wing deforms in response to turbulence, to how radio waves travel through and around objects, to how black holes generate gravitational waves when they merge. Numerical analysts develop algorithms for simulating these systems by solving PDEs on a computer; these simulations enable engineers and scientists to develop prototypes and to interpret data from sensors. For example, the NSF-funded Nobel-winning detection of gravitational waves would not have been possible without advances in numerical analysis. In recent decades, numerical analysts discovered that ideas from differential geometry, an area of pure mathematics, can be used to develop good algorithms for solving PDEs. In fact, these ideas help not only for geometric problems in fields of study like computer vision and general relativity, but also for fields like electromagnetism that have little to do with geometry. Although applying differential geometry to numerical analysis has been very successful, thus far this link has been explored only for a small number of differential geometry ideas. In this project, the investigators will continue exploring this link, taking more ideas from differential geometry and applying them to develop new numerical algorithms. These algorithms could then be used both in applied areas, by solving PDEs in science and engineering, and in pure areas, by solving PDEs in differential geometry itself. The project will also support the training of graduate student researchers.
This project focuses on problems at the interface of numerical analysis and differential geometry. It deals specifically with the design of finite element methods for PDEs that involve vector fields and tensor fields on Riemannian manifolds. In the long term, these efforts have the potential to lead to robust numerical methods for solving geometric PDEs like the Einstein field equations, which are useful for studying gravitational wave signals, as well as PDEs like the elasticity equations, which model how objects deform under stress. This project has three main goals. The first is to develop a new family of finite elements for discretizing algebraic curvature tensors and other bi-forms---tensor products of differential forms---on simplicial triangulations. The second goal is to develop an intrinsic finite element discretization of the Bochner Laplacian, which is a basic differential operator in Riemannian geometry that differs from the familiar Hodge Laplacian from finite element exterior calculus. The third goal is to leverage what we learn to design numerical methods for a wide range of geometric problems, such as computing spectra of elliptic operators on manifolds, simulating intrinsic geometric flows, and solving prescribed curvature problems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
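As a much-simplified instance of the third goal above (computing spectra of elliptic operators on manifolds), one can compute Laplace-Beltrami eigenvalues on a flat torus with periodic finite differences; the flat geometry and the discretization are our simplifications, far from the curved-manifold finite elements the project targets.

```python
# Eigenvalues of the Laplace-Beltrami operator on the flat square torus
# [0, 2*pi)^2 via periodic finite differences; the exact eigenvalues are
# m^2 + n^2, so the lowest ones are 0, 1 (x4), 2 (x4), ...
import numpy as np

n = 32
h = 2 * np.pi / n
D = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
D[0, -1] = D[-1, 0] = -1 / h**2            # periodic wrap-around
I = np.eye(n)
L = np.kron(D, I) + np.kron(I, D)          # 2D Laplacian on the torus grid
print(np.sort(np.linalg.eigvalsh(L))[:7])  # ~ [0, 1, 1, 1, 1, 2, 2]
```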
"2409918","Structure preservation in nonlinear, degenerate, evolution","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","06/03/2024","Abner Salgado","TN","University of Tennessee Knoxville","Standard Grant","Ludmil T. Zikatanov","07/31/2027","$204,533.00","","asalgad1@utk.edu","201 ANDY HOLT TOWER","KNOXVILLE","TN","379960001","8659743466","MPS","127100","9263","$0.00","A thorough treatment is feasible for the classical linear problems in the numerical approximation of partial differential equations. The continuous problem is well-posed. The numerical schemes are well-posed, parameter-robust, and convergent. It is even possible to prove convergence rates. However, the situation is more precarious for modern, complex systems of equations. Oftentimes, the uniqueness of solutions is not known. Even when there is uniqueness, the theory is far from complete, and so besides (weak) convergence of numerical solutions, little can be said about their behavior. In these scenarios, one must settle for simpler yet still relevant goals. An important goal in this front is that of structure preservation. The study of structure preservation in numerical methods is not new. Geometric numerical integration, many methods for electromagnetism, the finite element exterior calculus, and some novel approaches to hyperbolic systems of conservation laws, have this goal in mind: geometric, algebraic, or differential constraints must be preserved. This project does not focus on the problems mentioned above. Instead, it studies structure preservation in some evolution problems that have, possibly degenerate, diffusive behavior. This class of problems remains a largely unexplored topic when it comes to numerical discretizations. Bridging this gap will enhance modeling and prediction capabilities since diffusive models can be found in every aspect of scientific inquiry.
This project is focused on a class of diffusive problems in which stability of the solution cannot be obtained by standard energy arguments, in other words, by testing the equation with the solution to assert that certain space-time norms are under control. Norms are always convex. Structure preservation may then be a generalization of the approach given above. Instead of norms being under control, a family of convex functionals evaluated at the solution behaves predictably during the evolution. The project aims to develop numerical schemes that mimic this in the discrete setting. While this topic is largely unexplored, many of the problems under consideration can be used to describe a wide range of phenomena. In particular, the project will develop new numerical schemes for an emerging theory of non-equilibrium thermodynamics, active scalar equations, and a class of problems in hyperbolic geometry. These models have a very rich intrinsic structure and a wide range of applications, and the developments of this project will serve as a stepping stone to bring these tools to the numerical treatment of more general problems. The students involved in the project will be trained in exciting, mathematically and computationally challenging, and practically relevant areas of research.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
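A bare-bones illustration of the structure-preservation theme (our toy, not the project's schemes): for a monotone explicit discretization of the heat equation, every convex functional of the solution is non-increasing, because each update is a convex combination of neighboring values and Jensen's inequality applies.

```python
# Explicit finite differences for the heat equation with periodic boundaries
# and dt <= h^2/2: each update is a convex combination of neighbors, so ANY
# convex functional sum(phi(u_i)) is non-increasing along the evolution.
import numpy as np

n = 100
h = 1.0 / n
dt = 0.4 * h**2                       # satisfies the monotonicity condition
lam = dt / h**2
x = np.linspace(0, 1, n, endpoint=False)
u = 1.0 + 0.9 * np.sin(2 * np.pi * x) # positive initial data

phi = lambda v: v * np.log(v)         # a convex "entropy" (u stays positive)
for k in range(500):
    u = (1 - 2 * lam) * u + lam * (np.roll(u, 1) + np.roll(u, -1))
    if k % 100 == 0:
        print(k, phi(u).sum() * h)    # monotonically decreasing
```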
"2410677","Collaborative Research: Data-driven Realization of State-space Dynamical Systems via Low-complexity Algorithms","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","06/07/2024","Kshitij Khare","FL","University of Florida","Standard Grant","Jodi Mead","07/31/2027","$89,853.00","","kdkhare@stat.ufl.edu","1523 UNION RD RM 207","GAINESVILLE","FL","326111941","3523923516","MPS","127100","079Z, 9263","$0.00","Data science is evolving rapidly and places a new perspective on realizing state-space dynamical systems. Predicting time-advanced states of dynamical systems is a challenging problem in STEM disciplines due to their nonlinear and complex nature. This project will utilize data-driven methods and analyze state-space dynamical systems to predict and understand future states, surpassing classical techniques. In addition, the PI team will (i) guide students to obtain cross-discipline PhD/Master's degrees, (ii) guide students to work in a peer-learning environment, and (iii) educate a diverse group of undergraduates.
In more detail, this project will utilize state-of-the-art machine learning (ML) algorithms to efficiently analyze and predict information within data matrices and tensor computations with low-complexity algorithms. Single-dimensional ML models are not efficient at extracting hidden semantic information in the time and space domains. As a result, it becomes challenging to simultaneously capture multi-dimensional spatiotemporal data in state-space dynamical systems. Using efficient ML algorithms to recover multi-dimensional spatiotemporal data simultaneously offers a breakthrough in understanding the chaotic behavior of dynamical systems. This project will (i) utilize ML to predict future states of dynamical systems based on high-dimensional data matrices captured at different time stamps, (ii) realize state-space controllable and observable systems via low-complexity algorithms to simultaneously analyze multiple states of the systems, (iii) analyze noise in state-space systems for uncertainty quantification, predict patterns in real-time states, generate counter-resonance states to suppress them, and optimize performance and stability, (iv) study system resilience via multiple state predictors and perturbations to assess performance and adaptation to disturbances and anomalies, and finally (v) optimize spacecraft trajectories, avoid impact, and use low-complexity algorithms to understand spacecraft launch dynamics on the space coast and support ERAU's mission in aeronautical research.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2411208","Collaborative Research: Numerical Methods and Differential Geometry","DMS","COMPUTATIONAL MATHEMATICS","06/15/2024","06/04/2024","Evan Gawlik","HI","University of Hawaii","Standard Grant","Yuliya Gorb","05/31/2027","$190,527.00","","egawlik@hawaii.edu","2425 CAMPUS RD SINCLAIR RM 1","HONOLULU","HI","968222247","8089567800","MPS","127100","9150, 9263","$0.00","Partial differential equations (PDEs) model a wide variety of phenomena, ranging from how an airplane wing deforms in response to turbulence, to how radio waves travel through and around objects, to how black holes generate gravitational waves when they merge. Numerical analysts develop algorithms for simulating these systems by solving PDEs on a computer; these simulations enable engineers and scientists to develop prototypes and to interpret data from sensors. For example, the NSF-funded Nobel-winning detection of gravitational waves would not have been possible without advances in numerical analysis. In recent decades, numerical analysts discovered that ideas from differential geometry, an area of pure mathematics, can be used to develop good algorithms for solving PDEs. In fact, these ideas help not only for geometric problems in fields of study like computer vision and general relativity, but also for fields like electromagnetism that have little to do with geometry. Although applying differential geometry to numerical analysis has been very successful, thus far this link has been explored only for a small number of differential geometry ideas. In this project, the investigators will continue exploring this link, taking more ideas from differential geometry and applying them to develop new numerical algorithms. These algorithms could then be used both in applied areas, by solving PDEs in science and engineering, and in pure areas, by solving PDEs in differential geometry itself. The project will also support the training of graduate student researchers.
This project focuses on problems at the cusp of numerical analysis and differential geometry. It deals specifically with the design of finite element methods for PDEs that involve vector fields and tensor fields on Riemannian manifolds. In the long term, these efforts have the potential to lead to robust numerical methods for solving geometric PDEs like the Einstein field equations, which are useful for studying gravitational wave signals, as well as PDEs like the elasticity equations, which model how objects deform under stress. This project has three main goals. The first is to develop a new family of finite elements for discretizing algebraic curvature tensors and other bi-forms---tensor products of differential forms---on simplicial triangulations. The second goal is to develop an intrinsic finite element discretization of the Bochner Laplacian, which is a basic differential operator in Riemannian geometry that differs from the familiar Hodge Laplacian from finite element exterior calculus. The third goal is to leverage what we learn to design numerical methods for a wide range of geometric problems, such as computing spectra of elliptic operators on manifolds, simulating intrinsic geometric flows, and solving prescribed curvature problems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410893","Unsteady Reynolds averaged Navier-Stokes models and computational fluid dynamics","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","06/07/2024","William Layton","PA","University of Pittsburgh","Standard Grant","Ludmil T. Zikatanov","07/31/2027","$220,000.00","","wjl+@pitt.edu","4200 FIFTH AVENUE","PITTSBURGH","PA","152600001","4126247400","MPS","127100","9263","$0.00","The project will conduct research on the numerical solution of turbulent ?ows. Fluids transport and mix heat, chemical species, and contaminants. Accurate simulation of turbulent flow is essential for safety critical prediction and design in applications involving these and other e?ects. Turbulent ?ow prediction in science , engineering and industry requires the use of turbulence models. The research project has 3 objectives: increasing accuracy of these models, decreasing model complexity and exploring a promising algorithmic idea for computer solution of models. The proposed research also develops the expertise of graduate students in computational and applied mathematics while working on compelling problems addressing human needs. In their development into independent scientists, each student will develop their own research agenda and collaborate at points of contact among the problems studied.
Modeling turbulence presents challenges at every level in every discipline it touches. 2-equation Unsteady Reynolds Averaged Navier-Stokes models are common in applications and also the ones with the most incomplete mathematical foundation. They have many calibration parameters, work acceptably for flows similar to the calibration data set and require users to have an intuition about which model predictions to accept and which to ignore. The project's model analysis will address model accuracy, complexity and reliability. Even after modeling, greater computational resources are often required for their computational solution. In 1991 Ramshaw and Mesina proposed a non-obvious synthesis of penalty and artificial compression methods, resulting in a dispersive regularization of fluid motion. When the two effects were balanced, they reported a dramatic accuracy improvement over the most efficient current methods. The project will develop, improve and test the method based on a new analysis of energy flow.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410645","Computational Methods for Inverse and Optimal Design Problems in Topological Wave Insulators Based on Constructive Analysis","DMS","COMPUTATIONAL MATHEMATICS, EPSCoR Co-Funding","07/01/2024","06/04/2024","Junshan Lin","AL","Auburn University","Standard Grant","Troy D. Butler","06/30/2027","$300,263.00","","jzl0097@auburn.edu","321-A INGRAM HALL","AUBURN","AL","368490001","3348444438","MPS","127100, 915000","9150, 9263","$0.00","Topological wave insulators are a specialized material for transporting wave energy in various applications in modern science and engineering. This project will develop computational methods for several classes of inverse and optimal problems arising from the mathematical studies of partial differential equation (PDE) models for topological wave insulators. The goals of this project are to provide efficient computational algorithms that address several theoretical open questions in this area. Successful completion of this project should stimulate the mathematical research for topological insulators and beyond. The developed computational frameworks will also provide physical experimentalists and engineers with the computational tools to improve the performance and functionalities of topological materials. The project will also integrate students into the research team as part of their professional training.
The project will address several key scientific challenges arising from the inverse and optimal design of the spectrum of the PDE operators in topological wave insulators. First, based on the spectral analysis of the PDE operators in periodic media, a new optimization framework through the enforcement of parity for the eigenfunctions will be built to solve for wave insulators that attain Dirac points at desired Bloch wave vectors and eigenfrequencies. Numerical algorithms based on the construction of wave propagators in periodic media and the design of the spectral indicator function will be developed to efficiently identify the interface parameters that allow for the existence of edge modes in joint topological wave insulators. Finally, efficient convex semidefinite programming based numerical methods will be developed for solving the optimization problems that arise from maximizing the band gaps of the PDE operators for topological wave insulators in order to enlarge the spectral bandwidth of edge modes. This project is jointly funded by the Computational Mathematics and the Established Program to Stimulate Competitive Research (EPSCoR).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
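A generic 1D band-structure computation in the spirit of the spectral problems above (our simplification; the award concerns Dirac points, edge modes, and band-gap optimization in richer settings): a plane-wave discretization of the Bloch eigenvalue problem for a periodic medium, swept over the Bloch wave vector.

```python
# Plane-wave discretization of the 1D Bloch eigenvalue problem
# -u'' + V u = E u with V(x) = 4 cos(2 pi x) on a period-1 lattice. Sweeping
# the Bloch wave vector kappa traces the bands; a gap between consecutive
# bands that persists for all kappa is a band gap.
import numpy as np

G = 2 * np.pi * np.arange(-8, 9)           # truncated reciprocal lattice
m = len(G)
V = np.zeros((m, m))
for i in range(m):
    for j in range(m):
        if abs(i - j) == 1:                # Fourier coefficients of 4 cos(2 pi x)
            V[i, j] = 2.0
for kappa in np.linspace(0.0, np.pi, 5):   # half the Brillouin zone
    H = np.diag((kappa + G) ** 2) + V      # Bloch Hamiltonian at kappa
    print(round(kappa, 2), np.sort(np.linalg.eigvalsh(H))[:3])
```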
"2411229","Taming nonlinearity in PDE systems using lifted Newton nonlinear preconditioning","DMS","COMPUTATIONAL MATHEMATICS","09/01/2024","06/04/2024","Georg Stadler","NY","New York University","Standard Grant","Yuliya Gorb","08/31/2027","$399,998.00","","stadler@courant.nyu.edu","70 WASHINGTON SQ S","NEW YORK","NY","100121019","2129982121","MPS","127100","9263","$0.00","Many important questions in the natural sciences and in engineering involve nonlinear phenomena, mathematically described by nonlinear equations. Solving these problems typically requires iterative algorithms like Newton's method, which linearizes the nonlinear problem in each iteration. Newton's method is known for its rapid local convergence. However, the convergence theory only applies when the initialization is (very) close to the unknown solution. Thus, relying on local convergence theory is often impractical. Farther from the solution, small Newton updates are typically necessary to prevent divergence, leading to slow overall convergence. This project aims to develop better nonlinear solvers. This will benefit outer-loop problems, such as parameter estimation, learning, control, or design problems, which typically require solving many nonlinear (inner) problems. The project will also support the training and research of at least one graduate student, the mentoring of undergraduate students through the Courant?s Summer Undergraduate Research Experience (SURE) program, and the outreach to K-12 students through the cSplash activity in New York City.
To address issues of slow nonlinear convergence, this project aims to develop methods that lift the nonlinear system to a higher-dimensional space, enabling the application of nonlinear transformations that can mitigate nonlinearity before Newton linearization. The project will develop and systematically study the resulting novel Newton methods for severely nonlinear systems of partial differential equations (PDEs). The proposed lifting and transformation method can be interpreted as nonlinear preconditioning, a research area much less developed than preconditioning for linear systems. The goal of this project is to study for which classes of nonlinear PDE problems this approach improves convergence, to theoretically analyze why, and to make these methods a more broadly accessible tool for solving severely nonlinear systems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
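The baseline difficulty named in this abstract, Newton's method diverging when initialized far from the solution unless steps are damped, can be reproduced in a few lines. The sketch below contrasts full and backtracking Newton steps on arctan(x); it illustrates the motivation only, not the proposed lifting and preconditioning technique.

```python
# Plain vs. damped (line-search) Newton on f(x) = arctan(x), root x* = 0.
# From x0 = 1.5 the full Newton step overshoots and |x| grows each
# iteration; a simple backtracking line search restores convergence.
import numpy as np

f = np.arctan
fp = lambda x: 1.0 / (1.0 + x * x)

def newton(x, damped, iters=10):
    for _ in range(iters):
        step = -f(x) / fp(x)
        t = 1.0
        if damped:  # halve the step until |f| decreases
            while abs(f(x + t * step)) >= abs(f(x)) and t > 1e-8:
                t *= 0.5
        x = x + t * step
    return x

print("plain Newton :", newton(1.5, damped=False))  # diverges
print("damped Newton:", newton(1.5, damped=True))   # converges to 0
```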
@@ -43,11 +44,11 @@
"2410686","Modeling, discretizations, and solution strategies for multiphysics problems","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/29/2024","Ivan Yotov","PA","University of Pittsburgh","Standard Grant","Yuliya Gorb","06/30/2027","$420,000.00","","yotov@math.pitt.edu","4200 FIFTH AVENUE","PITTSBURGH","PA","152600001","4126247400","MPS","127100","9263","$0.00","The goal of this project is to advance the state-of-the-art in modeling and computation of multiphysics systems that model the physical interactions between two or more media, such as couplings of fluid flows, rigid or deformable porous media, and elastic structures. Typical examples are coupling of free fluid and porous media flows, fluid-structure interaction, and fluid-poroelastic structure interaction (FPSI). The developed methods will be employed for several biomedical and geoscience applications. Biomedical applications include investigation of non-Newtonian and poroelastic effects in arterial flows on important clinical markers such as wall shear stress and relative residence time, modeling LDL transport and drug delivery in blood flows, as well as flows in the eye and the brain. Geoscience applications include tracing organic and inorganic contaminants in coupled surface-subsurface hydrological systems, predicting hydrocarbon extraction in hydraulic fracturing, geothermal energy production, and modeling the effect of proppant particles in injected polymers on the fracture width and flow patterns. While focused on FPSI, the developments in this project will be applicable to modeling and computation of a wide class of multiphysics problems with a broad range of applications.
The project consists of a comprehensive program for mathematical and computational modeling of multiphysics problems that includes 1) development and analysis of new mathematical models, 2) design and analysis of stable, accurate, and robust structure-preserving numerical methods, 3) development and analysis of efficient time-splitting and multiscale domain decomposition algorithms for the solution of the resulting algebraic problems, and 4) applications to the geosciences and biomedicine. Variational formulations of new fluid-poroelastic structure interaction (FPSI) models based on Navier-Stokes-Biot couplings will be developed, extending current model capabilities to flows with higher Reynolds numbers. Fully coupled nonlinear FPSI-transport models, including miscible displacement models with concentration-dependent fluid viscosity, stress-dependent diffusion, and non-isothermal models will also be studied. Novel discretization techniques will be investigated for the numerical approximation of the FPSI models. The focus will be on dual mixed and total pressure discretizations with local conservation of mass and momentum, accurate approximations with continuous normal components for velocities and stresses, and robustness with respect to physical parameters. These include multipoint stress-flux mixed finite element methods and local-stress mimetic finite difference methods that can be reduced to positive definite cell-centered schemes. Efficient multiscale domain decomposition and time-splitting algorithms will be developed for the solution of the resulting algebraic systems. The domain decomposition methodology will be based on space-time variational formulations and will allow for multiple subdomains within each region with non-matching grids along subdomain interfaces and local time-stepping. The convergence of the space-time coarse-scale mortar interface iteration will be studied by analyzing the spectrum of the interface operator. Iterative and non-iterative time-splitting methods will also be investigated.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
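A minimal sketch of the substructuring idea behind such interface iterations: for a 1D Poisson model problem, interior unknowns of two subdomains are eliminated to expose the Schur-complement interface operator whose spectrum governs convergence. The grid, splitting, and direct subdomain solves below are illustrative stand-ins for the project's space-time mortar formulation.

```python
# Toy substructuring for -u'' = 1 on (0,1), u(0)=u(1)=0, split into two
# subdomains at x = 0.5: eliminate interior unknowns to get the (here
# scalar) Schur-complement interface operator, then solve on the interface.
import numpy as np

n = 99                       # interior grid points, h = 1/(n+1)
h = 1.0 / (n + 1)
A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
b = np.ones(n)

mid = n // 2                 # interface index (x = 0.5)
I = [mid]                    # interface set
L = list(range(mid))         # left interior
R = list(range(mid + 1, n))  # right interior

def block(rows, cols):
    return A[np.ix_(rows, cols)]

# Schur complement S = A_II - A_IL A_LL^{-1} A_LI - A_IR A_RR^{-1} A_RI
S = (block(I, I)
     - block(I, L) @ np.linalg.solve(block(L, L), block(L, I))
     - block(I, R) @ np.linalg.solve(block(R, R), block(R, I)))
g = (b[I]
     - block(I, L) @ np.linalg.solve(block(L, L), b[L])
     - block(I, R) @ np.linalg.solve(block(R, R), b[R]))

u_mid = np.linalg.solve(S, g)[0]
print(f"interface value {u_mid:.6f}  vs exact {0.5 * (1 - 0.5) / 2:.6f}")
```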
"2408912","Equity Beyond the Algorithm: A Mathematical Quest for Fairer-ness in Machine Learning","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/29/2024","Deanna Needell","CA","University of California-Los Angeles","Standard Grant","Troy D. Butler","06/30/2027","$275,000.00","","deanna@math.ucla.edu","10889 WILSHIRE BLVD STE 700","LOS ANGELES","CA","900244200","3107940102","MPS","127100","075Z, 079Z, 9263","$0.00","While machine learning (ML) and artificial intelligence (AI) are seeing widespread and rapid use across the world, very little is understood about many of their underlying mechanisms, and especially those revolving around fairness and bias. More examples are being reported every day that range from racist outputs of ChatGPT to imaging AI that predicts former president Barack Obama's face to be white. The mathematical community has fallen behind the rush to use ML and AI, yet mathematics is at the heart of the algorithmic designs and mechanisms behind ML and AI. This project will study fairness in ML and AI from several angles. First, it will create a framework that identifies fairness metrics throughout the algorithmic pipelines. Second, it will develop technologies to mitigate biases and improve fairness. Third, it will develop mathematical foundations to help us understand the mechanisms at work inside of many of these so-called black-box methods. In addition, medical and social justice applications will be integrated throughout the project, helping many nonprofits with high data driven needs meet their goals. These include medical applications helping to understand manifestations of Lyme disease as well as tools to help Innocence projects that work to free innocent people from prison, make appeal decisions, and synthesize case files. This synergistic approach both serves the community while also allowing those applications to fuel motivation for new and better mathematics. In addition, students will be integrated within the research team as part of their training.
Although ML and AI methods have expanded by leaps and bounds, there are still critical issues around fairness and bias that remain unresolved. The focus of this project consists of two main goals. First, it will create a framework where ML and AI methods generate informative descriptions about fairness across population groups. Subsequently, a mechanism will be applied based on this assessment to promote fairness across the population. This direction will both establish a structured framework for researchers and practitioners to report fairness metrics and emphasize their significance, while also enabling algorithms to adjust for fairness. The majority of the first goal revolves around showcasing this framework in ML applications including dimension reduction, topic modeling, classification, clustering, data completion, and prediction modeling. Second, the project will provide foundational mathematical support for more complex, seemingly opaque techniques such as neural networks and large language models. This includes the investigation of mathematically tangible shallow networks to understand their behavior in benign and non-benign overfitting. The project will also analyze the geometry of embeddings derived from large language models using a linear algebraic topic modeling approach, which is tied to the first goal. Applications with nonprofit community partners will be included throughout the duration of the project, including those in medicine and criminal and social justice. In total, successful completion of the proposed work will provide a pivotal step towards creating a more equitable and mathematically grounded machine learning landscape.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
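A minimal sketch of the kind of group-wise fairness reporting the first goal describes, using two standard metrics on synthetic predictions; the metric choices, the synthetic data, and the deliberately biased model below are illustrative assumptions, not the project's framework.

```python
# Minimal group-fairness audit: demographic-parity and equal-opportunity
# gaps for a classifier's predictions across a binary protected group.
import numpy as np

rng = np.random.default_rng(0)
group = rng.integers(0, 2, 1000)            # protected attribute
y_true = rng.integers(0, 2, 1000)           # ground-truth labels
y_pred = (rng.random(1000) < 0.4 + 0.2 * group).astype(int)  # biased model

def rate(mask):
    return y_pred[mask].mean()

dp_gap = abs(rate(group == 0) - rate(group == 1))
eo_gap = abs(rate((group == 0) & (y_true == 1))
             - rate((group == 1) & (y_true == 1)))
print(f"demographic parity gap: {dp_gap:.3f}")
print(f"equal opportunity gap : {eo_gap:.3f}")
```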
"2411198","Collaborative Research: Randomized algorithms for dynamic and hierarchical Bayesian inverse problems","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/30/2024","Arvind Saibaba","NC","North Carolina State University","Standard Grant","Troy D. Butler","07/31/2027","$170,000.00","","asaibab@ncsu.edu","2601 WOLF VILLAGE WAY","RALEIGH","NC","276950001","9195152444","MPS","127100","9263","$0.00","Inverse problems appear in a diverse array of applications - in medical imaging, for X-ray computerized tomography, ultrasound, and magnetic resonance imaging; in geophysics, for atmospheric tomography, electrical resistivity tomography, seismic tomography, and weather prediction; in material sciences, for X-ray ptychography; in homeland security applications, for luggage scanning; in astrophysics, to image black holes and cosmic background estimation. The main goal of solving inverse problems is to use measurements to estimate the parameters of physical models. Being able to solve inverse problems efficiently, accurately, and with quantifiable certainty remains an open challenge. Randomized algorithms have made several advances in numerical linear algebra due to their ability to dramatically reduce computational costs without significantly compromising the accuracy of the computations. However, there is a rich and relatively unexplored field of research that lies between randomized numerical linear algebra and inverse problems, in particular for dynamic and hierarchical problems, where randomization can and should be exploited in unique ways. This project will address fundamental issues in the development of computationally efficient solvers for inverse problems and uncertainty quantification. The project will also train graduate students on start-of-the-art randomized algorithms.
The project will develop new and efficient randomized algorithms for mitigating the computational burdens of two types of inverse problems: hierarchical Bayesian inverse problems and dynamical inverse problems. The two main thrusts of this project are (i) to develop efficient algorithms to quantify the uncertainty of the hyperparameters that govern Bayesian inverse problems, and (ii) to develop new iterative methods that leverage randomization to efficiently approximate solutions and enable uncertainty quantification for large-scale inverse problems. This project will advance knowledge in the field of randomized algorithms for computational inverse problems and uncertainty quantification. It will also create numerical methods that are expected to be broadly applicable to many areas of science and engineering.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
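One widely used building block in this area is the randomized low-rank sketch. The toy below uses a Halko-Martinsson-Tropp-style randomized SVD to approximate the posterior covariance of a linear-Gaussian inverse problem with unit prior and noise covariances; it is a generic illustration of how randomization enters uncertainty quantification, not the project's algorithms.

```python
# Randomized rank-k SVD of a forward operator with decaying spectrum,
# then a low-rank (Woodbury-style) posterior covariance approximation
# for y = Ax + noise with standard-normal prior and noise.
import numpy as np

rng = np.random.default_rng(1)
n = 400
U0 = np.linalg.qr(rng.standard_normal((n, n)))[0]
A = U0 @ np.diag(0.9 ** np.arange(n)) @ U0.T   # rapidly decaying spectrum

def rand_svd(A, k, p=10):
    """Rank-k SVD via a Gaussian sketch with oversampling p."""
    Y = A @ rng.standard_normal((A.shape[1], k + p))
    Q = np.linalg.qr(Y)[0]
    Uh, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ Uh)[:, :k], s[:k], Vt[:k]

U, s, Vt = rand_svd(A, k=25)
# Posterior covariance (I + A^T A)^{-1} ~ I - V diag(s^2/(1+s^2)) V^T,
# a low-rank update that is cheap to store and apply.
C_post_lr = np.eye(n) - Vt.T @ np.diag(s**2 / (1 + s**2)) @ Vt
C_post = np.linalg.inv(np.eye(n) + A.T @ A)
print("covariance error:", np.linalg.norm(C_post - C_post_lr))
```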
+"2410140","Manifold learning in Wasserstein space using Laplacians: From graphs to submanifolds","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/29/2024","Caroline Moosmueller","NC","University of North Carolina at Chapel Hill","Standard Grant","Troy D. Butler","07/31/2027","$419,962.00","Shiying Li","cmoosm@unc.edu","104 AIRPORT DR STE 2200","CHAPEL HILL","NC","275995023","9199663411","MPS","127100","079Z, 9263","$0.00","Manifold learning algorithms are tools used to reveal the underlying structure of high-dimensional datasets. This can be achieved by finding a lower-dimensional representation of the dataset, thereby enhancing the efficiency of subsequent data analysis. They find applications across various fields such as single-cell analysis, natural language processing, and neuroscience. While most existing algorithms are designed for datasets represented in vector spaces, real-world data often comprises distributions or point-clouds, presenting both theoretical and computational challenges for manifold learning algorithms. This project will develop manifold learning algorithms tailored for distributional or point-cloud datasets, with a particular emphasis on theoretical analysis and computational efficiency. Leveraging the framework of optimal transport and established manifold learning theory in vector spaces, the project will address these challenges. This project will also train students in interdisciplinary aspects of the research.
This project will develop and analyze algorithms for uncovering low-dimensional intrinsic structures of data sets within Wasserstein space, a natural space for distributions or point-clouds. This is motivated by the recent success in representing data as elements in Wasserstein space, as opposed to Euclidean space, and the necessity to develop efficient algorithms for their analysis. To accomplish the goals of this project, the research team will leverage the eigenvectors of a Laplacian matrix built from a data-dependent graph. Specifically, consistency theory of operators such as the Laplacian between the discrete (graph) and the continuous (submanifold) setting will be developed, drawing inspiration from the well-established theory for finite-dimensional Riemannian manifolds. The project will develop theoretically provable methods that provide algorithmic insights, which in turn can be used for efficient algorithms. The aims are threefold: (1) define dimensionality reduction algorithms for point-cloud data that can uncover curved submanifolds through suitable embeddings, (2) provide theoretical guarantees for these embeddings, and (3) design efficient algorithms for applications in high-dimensional settings such as single-cell data analysis.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
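A minimal version of the pipeline sketched here: pairwise Wasserstein-2 distances between 1D point clouds (which have a closed form via sorted samples), a Gaussian kernel graph, and the bottom nontrivial eigenvector of the graph Laplacian as a one-dimensional embedding. The synthetic data and kernel bandwidth are illustrative assumptions.

```python
# Laplacian-eigenmap sketch on point-cloud data living in Wasserstein space.
import numpy as np

rng = np.random.default_rng(2)
# 60 datasets, each a 1D point cloud N(mu, 0.25): a curve in Wasserstein space
mus = np.linspace(-3, 3, 60)
clouds = [mu + 0.5 * rng.standard_normal(200) for mu in mus]

def w2(a, b):
    """Wasserstein-2 distance between 1D empirical measures (equal size)."""
    return np.sqrt(np.mean((np.sort(a) - np.sort(b)) ** 2))

n = len(clouds)
D = np.array([[w2(clouds[i], clouds[j]) for j in range(n)] for i in range(n)])
W = np.exp(-D**2 / np.median(D) ** 2)      # kernel graph weights
Lap = np.diag(W.sum(1)) - W                # unnormalized graph Laplacian
vals, vecs = np.linalg.eigh(Lap)
embedding = vecs[:, 1]                     # first nontrivial eigenvector
print("|correlation| with true parameter mu:",
      abs(np.corrcoef(embedding, mus)[0, 1]).round(3))
```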
"2411141","Nonlinear Eigenvalue Problems: Building a New Paradigm Through the Lens of Systems Theory and Rational Interpolation","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/29/2024","Serkan Gugercin","VA","Virginia Polytechnic Institute and State University","Continuing Grant","Jodi Mead","07/31/2027","$89,958.00","Mark Embree","gugercin@math.vt.edu","300 TURNER ST NW","BLACKSBURG","VA","240603359","5402315281","MPS","127100","9263","$0.00","When building new devices or products, engineers naturally want to optimize performance with respect to some design variables; this process typically involves simulation with large-scale mathematical models. One desirable goal of this optimization is to maximize the stability of a system, to avoid designs for which small disturbances can get magnified until failure occurs. This project will study new approaches for assessing such stability, including a technique for simultaneously analyzing an entire ensemble of systems across a range of design variables, rather than analyzing individual systems one at a time. These techniques involve the symbiotic interplay of data and mathematical models. The project will involve graduate student training and professional development through summer research and capstone projects for Virginia Tech?s Computational Modeling and Data Analytics major.
Nonlinear eigenvalue problems (NLEVPs) arise naturally in many applications throughout science and engineering, from networks of vibrating structures to dynamical systems with time delays. In contrast to the linear eigenvalue problem, algorithms for solving NLEVPs remain in an unsettled state due to the fundamental challenges these problems pose. This project approaches NLEVPs through the lens of control theory, identifying contour-based eigenvalue algorithms as examples of system realization techniques. Given this perspective, this research program seeks to develop robust, reliable algorithms and software for NLEVPs, with an eye toward optimal parameter selection and efficiency for large-scale problems. These analytical and computational methods will be extended to handle parameter-dependent NLEVPs, where the problem varies based on one or more physical parameters. The project will also look in the opposite direction, using contour integral algorithms from eigenvalue computations to offer new approaches to data-driven modeling of dynamical systems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
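The contour-based eigenvalue algorithms referenced here can be illustrated with a Beyn-style two-moment method: contour integrals of T(z)^{-1} against random probes, followed by an SVD reduction whose small eigenproblem yields the eigenvalues inside the contour. The diagonal test problem and quadrature parameters below are toy choices, not the project's solver.

```python
# Beyn-style contour-integral method for T(z)v = 0. Toy problem
# T(z) = diag(z^2 - 1, z^2 - 4); the circle centered at 1.5 with
# radius 1 encloses the eigenvalues 1 and 2.
import numpy as np

def T(z):
    return np.diag([z**2 - 1.0, z**2 - 4.0])

rng = np.random.default_rng(3)
n, m, N = 2, 2, 64                         # size, probes, quadrature nodes
V = rng.standard_normal((n, m))
c, r = 1.5, 1.0
A0 = np.zeros((n, m), dtype=complex)
A1 = np.zeros((n, m), dtype=complex)
for theta in 2 * np.pi * np.arange(N) / N:
    z = c + r * np.exp(1j * theta)
    S = np.linalg.solve(T(z), V) * (r * np.exp(1j * theta) / N)
    A0 += S                                # (1/2 pi i) contour int of T(z)^{-1} V
    A1 += z * S                            # ... of z T(z)^{-1} V
U, s, Wt = np.linalg.svd(A0, full_matrices=False)
# (in general one would truncate tiny singular values here)
B = U.conj().T @ A1 @ Wt.conj().T / s      # small linearized problem
print("eigenvalues inside contour:", np.sort(np.linalg.eigvals(B).real))
```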
"2410699","Collaborative Research: Memory-aware Accelerated Solvers for Nonlinear Problems in High Dimensional Imaging Applications","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/30/2024","Mirjeta Pasha","VA","Virginia Polytechnic Institute and State University","Standard Grant","Troy D. Butler","06/30/2027","$162,490.00","","mpasha@vt.edu","300 TURNER ST NW","BLACKSBURG","VA","240603359","5402315281","MPS","127100","9263","$0.00","Achieving significant advances over the state-of-the-art methods for imaging applications such as reconstruction and high-dimensional image/video compression requires fundamentally new approaches to keep up with the volumes of (multi-dimensional and possibly streaming) data that have become pervasive in science and engineering applications. Consider, for instance, the need to continuously monitor, diagnose, and visualize anomalies within the body through modern magnetic resonance (MR) and computerized tomography (CT) scans, to inspect objects at crowded checkpoints, or check surveillance video for possible threats. These applications present a common challenge: the need to process an ever-increasing amount of data quickly and accurately to enable real-time decisions at a low computational cost while respecting limited memory capacities. This collaborative project will address these challenges through an innovative, multi-pronged, mathematical and algorithmic framework that capitalizes on properties inherent in the data as well as on features in the solutions (i.e. images, video frames) that persist over time and/or as the solutions are being resolved. The work produced will have broad scientific impact: for example, the newly offered speed of the image reconstruction methods may improve the ability to detect anomalies in tissue, underground or in luggage, while the compression algorithms hold promise for other disciplines where the ability to compress and explain multi-way (a.k.a tensor) data is paramount, such as satellite imaging, biology, and data science. Graduate students will be trained as part of this project.
Digital images and video are inherently multi-way objects. A single, grayscale, digital image is a two-dimensional array of numbers with the numbers coded to appear as shades of gray whereas a collection of such grayscale images, such as video frames, are three-way arrays, also called third order tensors. The benefit of tensor compression (or completion, if some image values are missing or obscured) techniques in terms of quality over more traditional matrix-based methods merits their use. Reconstructing images that preserve edges is also of paramount importance: consider that an image edge defines the boundary between a tumor and normal tissue, for instance. This project will focus on these two distinct imaging problems, edge-based reconstruction and compressed tensor data representation, whose solution requires memory-efficient iterative approaches, but for which the state-of-the-art iterative techniques are slow to converge and memory intensive. The acceleration will be achieved by a combination of judicious choice of limited memory recycled subspaces, classical acceleration approaches (e.g., NGMRES or Anderson Acceleration), and operator approximation. Furthermore, if the data arrives asynchronously or the regularized problem cannot all fit into memory at once, the method will extend to streamed-recycling. The streamed-recycling approach will break the problem up into memory-manageable chunks while keeping a small-dimensional subspace that encodes and retains the most important features to enable solution to the original, large-scale problem. The impact of the accelerated edge-preserving image reconstruction algorithms will be demonstrated on X-ray CT, but the algorithms will have much wider applicability in other imaging modalities.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
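Anderson acceleration, one of the classical acceleration approaches named above, fits in a short routine: a windowed least-squares combination of recent residuals. The sketch below is a generic type-II variant on a scalar fixed-point problem, not the project's memory-aware solver.

```python
# Anderson acceleration of a fixed-point iteration x = g(x), using a
# windowed least-squares fit on residual differences (type-II form).
import numpy as np

def anderson(g, x0, m=5, iters=50, tol=1e-12):
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    X, F = [x], [g(x) - x]                  # iterates and residuals
    for k in range(iters):
        if np.linalg.norm(F[-1]) < tol:
            break
        if len(F) > 1:
            dF = np.column_stack([F[i+1] - F[i] for i in range(len(F)-1)])
            dG = np.column_stack([(X[i+1] + F[i+1]) - (X[i] + F[i])
                                  for i in range(len(F)-1)])
            gamma = np.linalg.lstsq(dF, F[-1], rcond=None)[0]
            x = (X[-1] + F[-1]) - dG @ gamma
        else:
            x = X[-1] + F[-1]               # plain fixed-point step
        X.append(x); F.append(g(x) - x)
        X, F = X[-(m+1):], F[-(m+1):]       # keep a sliding window
    return x, k

x, k = anderson(np.cos, 1.0)
print(f"fixed point of cos: {x[0]:.12f} after {k} accelerated steps")
```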
"2411197","Collaborative Research: Randomized algorithms for dynamic and hierarchical Bayesian inverse problems","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/30/2024","Julianne Chung","GA","Emory University","Standard Grant","Troy D. Butler","07/31/2027","$170,000.00","","jmchung@emory.edu","201 DOWMAN DR NE","ATLANTA","GA","303221061","4047272503","MPS","127100","9263","$0.00","Inverse problems appear in a diverse array of applications - in medical imaging, for X-ray computerized tomography, ultrasound, and magnetic resonance imaging; in geophysics, for atmospheric tomography, electrical resistivity tomography, seismic tomography, and weather prediction; in material sciences, for X-ray ptychography; in homeland security applications, for luggage scanning; in astrophysics, to image black holes and cosmic background estimation. The main goal of solving inverse problems is to use measurements to estimate the parameters of physical models. Being able to solve inverse problems efficiently, accurately, and with quantifiable certainty remains an open challenge. Randomized algorithms have made several advances in numerical linear algebra due to their ability to dramatically reduce computational costs without significantly compromising the accuracy of the computations. However, there is a rich and relatively unexplored field of research that lies between randomized numerical linear algebra and inverse problems, in particular for dynamic and hierarchical problems, where randomization can and should be exploited in unique ways. This project will address fundamental issues in the development of computationally efficient solvers for inverse problems and uncertainty quantification. The project will also train graduate students on start-of-the-art randomized algorithms.
The project will develop new and efficient randomized algorithms for mitigating the computational burdens of two types of inverse problems: hierarchical Bayesian inverse problems and dynamical inverse problems. The two main thrusts of this project are (i) to develop efficient algorithms to quantify the uncertainty of the hyperparameters that govern Bayesian inverse problems, and (ii) to develop new iterative methods that leverage randomization to efficiently approximate solutions and enable uncertainty quantification for large-scale inverse problems. This project will advance knowledge in the field of randomized algorithms for computational inverse problems and uncertainty quantification. It will also create numerical methods that are expected to be broadly applicable to many areas of science and engineering.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2410698","Collaborative Research: Memory-aware Accelerated Solvers for Nonlinear Problems in High Dimensional Imaging Applications","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/30/2024","Misha Kilmer","MA","Tufts University","Standard Grant","Troy D. Butler","06/30/2027","$162,510.00","","misha.kilmer@tufts.edu","169 HOLLAND ST","SOMERVILLE","MA","021442401","6176273696","MPS","127100","9263","$0.00","Achieving significant advances over the state-of-the-art methods for imaging applications such as reconstruction and high-dimensional image/video compression requires fundamentally new approaches to keep up with the volumes of (multi-dimensional and possibly streaming) data that have become pervasive in science and engineering applications. Consider, for instance, the need to continuously monitor, diagnose, and visualize anomalies within the body through modern magnetic resonance (MR) and computerized tomography (CT) scans, to inspect objects at crowded checkpoints, or check surveillance video for possible threats. These applications present a common challenge: the need to process an ever-increasing amount of data quickly and accurately to enable real-time decisions at a low computational cost while respecting limited memory capacities. This collaborative project will address these challenges through an innovative, multi-pronged, mathematical and algorithmic framework that capitalizes on properties inherent in the data as well as on features in the solutions (i.e. images, video frames) that persist over time and/or as the solutions are being resolved. The work produced will have broad scientific impact: for example, the newly offered speed of the image reconstruction methods may improve the ability to detect anomalies in tissue, underground or in luggage, while the compression algorithms hold promise for other disciplines where the ability to compress and explain multi-way (a.k.a tensor) data is paramount, such as satellite imaging, biology, and data science. Graduate students will be trained as part of this project.
Digital images and video are inherently multi-way objects. A single, grayscale, digital image is a two-dimensional array of numbers with the numbers coded to appear as shades of gray whereas a collection of such grayscale images, such as video frames, are three-way arrays, also called third order tensors. The benefit of tensor compression (or completion, if some image values are missing or obscured) techniques in terms of quality over more traditional matrix-based methods merits their use. Reconstructing images that preserve edges is also of paramount importance: consider that an image edge defines the boundary between a tumor and normal tissue, for instance. This project will focus on these two distinct imaging problems, edge-based reconstruction and compressed tensor data representation, whose solution requires memory-efficient iterative approaches, but for which the state-of-the-art iterative techniques are slow to converge and memory intensive. The acceleration will be achieved by a combination of judicious choice of limited memory recycled subspaces, classical acceleration approaches (e.g., NGMRES or Anderson Acceleration), and operator approximation. Furthermore, if the data arrives asynchronously or the regularized problem cannot all fit into memory at once, the method will extend to streamed-recycling. The streamed-recycling approach will break the problem up into memory-manageable chunks while keeping a small-dimensional subspace that encodes and retains the most important features to enable solution to the original, large-scale problem. The impact of the accelerated edge-preserving image reconstruction algorithms will be demonstrated on X-ray CT, but the algorithms will have much wider applicability in other imaging modalities.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2410140","Manifold learning in Wasserstein space using Laplacians: From graphs to submanifolds","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/29/2024","Caroline Moosmueller","NC","University of North Carolina at Chapel Hill","Standard Grant","Troy D. Butler","07/31/2027","$419,962.00","Shiying Li","cmoosm@unc.edu","104 AIRPORT DR STE 2200","CHAPEL HILL","NC","275995023","9199663411","MPS","127100","079Z, 9263","$0.00","Manifold learning algorithms are tools used to reveal the underlying structure of high-dimensional datasets. This can be achieved by finding a lower-dimensional representation of the dataset, thereby enhancing the efficiency of subsequent data analysis. They find applications across various fields such as single-cell analysis, natural language processing, and neuroscience. While most existing algorithms are designed for datasets represented in vector spaces, real-world data often comprises distributions or point-clouds, presenting both theoretical and computational challenges for manifold learning algorithms. This project will develop manifold learning algorithms tailored for distributional or point-cloud datasets, with a particular emphasis on theoretical analysis and computational efficiency. Leveraging the framework of optimal transport and established manifold learning theory in vector spaces, the project will address these challenges. This project will also train students in interdisciplinary aspects of the research.
This project will develop and analyze algorithms for uncovering low-dimensional intrinsic structures of data sets within Wasserstein space, a natural space for distributions or point-clouds. This is motivated by the recent success in representing data as elements in Wasserstein space, as opposed to Euclidean space, and the necessity to develop efficient algorithms for their analysis. To accomplish the goals of this project, the research team will leverage the eigenvectors of a Laplacian matrix built from a data-dependent graph. Specifically, consistency theory of operators such as the Laplacian between the discrete (graph) and the continuous (submanifold) setting will be developed, drawing inspiration from the well-established theory for finite-dimensional Riemannian manifolds. The project will develop theoretically provable methods that provide algorithmic insights, which in turn can be used for efficient algorithms. The aims are threefold: (1) define dimensionality reduction algorithms for point-cloud data that can uncover curved submanifolds through suitable embeddings, (2) provide theoretical guarantees for these embeddings, and (3) design efficient algorithms for applications in high-dimensional settings such as single-cell data analysis.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2409858","Dynamical Low Rank Methods for Multiscale Kinetic Plasma Simulations","DMS","COMPUTATIONAL MATHEMATICS","08/01/2024","05/29/2024","Jingwei Hu","WA","University of Washington","Standard Grant","Ludmil T. Zikatanov","07/31/2027","$220,000.00","","hujw@uw.edu","4333 BROOKLYN AVE NE","SEATTLE","WA","981951016","2065434043","MPS","127100","9263","$0.00","Plasmas consist of many charged particles, such as electrons and ions. The Boltzmann equation is often regarded as the first-principle model for plasmas; however, its numerical simulation is prohibitively expensive even on today?s most powerful supercomputers. The challenges manifest as: 1) High-dimensionality. The Boltzmann equation resides in six-dimensional phase space. Hence, full 6D deterministic simulation requires excessive computing effort and memory. 2) Collision operator. Collisions between particles are described by nonlinear, nonlocal integral operators that are extremely difficult to approximate. Yet, they play a critical role in driving the system towards local thermodynamic equilibrium and must be included in the simulation, especially in transition and fluid regimes. 3) Multiple scales. Plasmas inherently exhibit multiscale physics. Different scaling can lead to different asymptotic models. How to conduct efficient kinetic simulations such that multiscale behaviors are properly captured is a long-standing problem. The overall objective of this project is to develop a set of ultra-efficient deterministic numerical methods for multiscale kinetic plasma simulations. The algorithms to be developed in this project have the potential to provide high-fidelity kinetic plasma simulations across a range of regimes at a manageable computational cost.
The basic framework we will employ is the dynamical low-rank method (DLRM), a robust dimension reduction technique for solving high-dimensional partial differential equations. In essence, DLRM can be viewed as a time-dependent singular value decomposition; instead of solving the 6D equation, it tracks the dynamics of low-rank factors of the solution, which depend on either the three-dimensional position variable or the three-dimensional velocity variable, thus drastically reducing the computational cost and memory footprint. Our focus will be on the nonlinear collisional kinetic equations for plasmas, allowing us to address a broader range of regimes beyond the collisionless ones. We will design an efficient low-rank ansatz inspired by various asymptotic limits of plasma kinetic equations such that the method only requires a few ranks in the limiting regime and is as efficient as solving the reduced fluid models. We will also study the uniform stability and long-time behavior of DLRM rigorously, justifying the method's robustness for treating multiscale problems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
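The simplest relative of DLRM is a step-truncation scheme: advance the full matrix ODE one explicit step, then retruncate to rank r by SVD. DLRM instead evolves the low-rank factors directly, but the toy below (a 2D heat equation with made-up sizes and step) shows why low rank can be sustained in time.

```python
# Step-truncation sketch of low-rank time integration for the matrix ODE
# dA/dt = L A + A L^T (2D heat equation), comparing against a full solve.
import numpy as np

n, r, h, steps = 32, 8, 1e-4, 200
L = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) * (n + 1) ** 2   # 1D Laplacian
x = np.linspace(0, 1, n)
A = np.outer(np.sin(np.pi * x), np.sin(2 * np.pi * x))  # rank-1 start

def truncate(A, r):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r]

A_lr, A_full = A.copy(), A.copy()
for _ in range(steps):
    A_full = A_full + h * (L @ A_full + A_full @ L.T)
    A_lr = truncate(A_lr + h * (L @ A_lr + A_lr @ L.T), r)
print("relative low-rank error:",
      np.linalg.norm(A_lr - A_full) / np.linalg.norm(A_full))
```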
"2411120","Improving quantum speedup for solving differential equations","DMS","COMPUTATIONAL MATHEMATICS","09/01/2024","05/30/2024","Xiantao Li","PA","Pennsylvania State Univ University Park","Continuing Grant","Troy D. Butler","08/31/2027","$59,632.00","","xli@math.psu.edu","201 OLD MAIN","UNIVERSITY PARK","PA","168021503","8148651372","MPS","127100","9263","$0.00","The ultimate challenge in many areas of applied science can be attributed to the limited capability of solving large-scale differential equations. Classical computers encounter a fundamental bottleneck due to the nonlinearity, vast number of degrees of freedom, and inherent stochasticity of these equations. Motivated by the emergence of quantum computing, which promises significant speedups over classical methods for many scientific computing problems, particularly those involving quantum dynamics governed by the Schrodinger equation, this research aims to establish an innovative mathematical framework. This framework will transform a broad range of differential equations into the Schrodinger equation, enabling the application of quantum algorithms. Such quantum speedup has the potential to enhance the prediction of physical properties and optimize system performance based on differential equation models. To ensure broader scientific and societal impacts, the research team will disseminate results at quantum information processing conferences and also integrate graduate students within the research plan as part of their professional training.
The principal investigator will develop an encoding scheme to represent large-scale differential equations within unitary dynamics through a shadow Hamiltonian. Using backward error analysis, the research aims to systematically construct a shadow Hamiltonian with an arbitrarily high order of accuracy. Moreover, a precise procedure will be developed for mapping nonlinear and stochastic differential equations into such unitary evolution, significantly broadening the applicability of the proposed encoding scheme. The quantum algorithms derived from this project will be applied to non-Hermitian dynamics from topological materials and chemical Langevin dynamics from biomolecular modeling, aiming to make a direct impact on critical physics and engineering fields.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
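For orientation, the target form of such encodings is easy to check in the special case where the system matrix is anti-Hermitian: then du/dt = Au is already a Schrodinger equation with Hermitian Hamiltonian H = iA, and the flow is unitary. The shadow-Hamiltonian construction in this project goes well beyond this easy case; the snippet only verifies the easy direction.

```python
# If A is anti-Hermitian (A^dag = -A), then H = iA is Hermitian and
# e^{At} = e^{-iHt} is unitary, so the flow preserves the norm of u.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B - B.conj().T                 # anti-Hermitian by construction
H = 1j * A
assert np.allclose(H, H.conj().T)  # Hermitian check

u0 = rng.standard_normal(5) + 1j * rng.standard_normal(5)
u1 = expm(A * 0.7) @ u0            # equals expm(-1j * H * 0.7) @ u0
print("norm preserved:", np.isclose(np.linalg.norm(u1), np.linalg.norm(u0)))
```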
"2409855","Metric-Dependent Strategies for Inverse Problem Analysis and Computation","DMS","COMPUTATIONAL MATHEMATICS","07/01/2024","05/29/2024","Yunan Yang","NY","Cornell University","Standard Grant","Troy D. Butler","06/30/2027","$275,000.00","","yunan.yang@cornell.edu","341 PINE TREE RD","ITHACA","NY","148502820","6072555014","MPS","127100","9263","$0.00","This project will develop novel approaches to solving inverse problems, which are pivotal in many scientific fields, including biology, geophysics, and medical imaging. Inverse problems often involve deducing unknown parameters from observed data, a task complicated by issues such as sensitivity to measurement noise and complex modeling procedures. The broader significance of this research lies in its potential to significantly enhance the accuracy and efficiency of computational methods used in critical applications such as electrical impedance tomography (EIT), inverse scattering, and cryo-electron microscopy (cryo-EM). For instance, improvements in cryo-EM computation will accelerate breakthroughs in molecular biology and aid in rapid drug development, directly benefiting medical research and public health. Additionally, this project will also (1) engage undergraduate and graduate students in research to foster a new generation of computational mathematicians, and (2) promote STEM careers among K-12 students through outreach activities.
The technical focus of this project will be on the development of metric-dependent strategies to improve the stability and computational efficiency of solving inverse problems. Lipschitz-type stability will be established by selecting metrics tailored to the data and unknown parameters to facilitate more robust algorithmic solutions. A key highlight of the project will be the investigation of the stochastic inverse problem's well-posedness. Sampling methods inspired by metric-dependent gradient flows will serve as the novel computational tool for the practical solution of stochastic inverse problems. These analytical and computational strategies will be designed to handle the randomness inherent in many practical scenarios, shifting the traditional deterministic approach for solving inverse problems to a probabilistic framework that better captures the intricacies of real-world data. This research promises not only to advance theoretical knowledge of inverse problems but also to develop practical, efficient tools for a wide range of applications in science and engineering.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
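One concrete instance of a metric-dependent gradient flow is the unadjusted Langevin algorithm, which discretizes the Wasserstein gradient flow of the KL divergence to a target density proportional to exp(-V). The double-well potential, step size, and ensemble size below are illustrative choices.

```python
# Unadjusted Langevin sampler: an Euler-Maruyama discretization of the
# Wasserstein gradient flow of KL(rho || pi) for pi ~ exp(-V),
# with V(x) = (x^2 - 1)^2 a 1D double well.
import numpy as np

rng = np.random.default_rng(5)
gradV = lambda x: 4 * x * (x**2 - 1)
x = rng.standard_normal(5000)              # initial ensemble
h = 1e-3
for _ in range(5000):
    x = x - h * gradV(x) + np.sqrt(2 * h) * rng.standard_normal(x.size)
print("sample mean / var:", x.mean().round(3), x.var().round(3))
```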
diff --git a/Mathematical-Biology/Awards-Mathematical-Biology-2024.csv b/Mathematical-Biology/Awards-Mathematical-Biology-2024.csv
index 9a949d0..3eb1b30 100644
--- a/Mathematical-Biology/Awards-Mathematical-Biology-2024.csv
+++ b/Mathematical-Biology/Awards-Mathematical-Biology-2024.csv
@@ -1,18 +1,22 @@
"AwardNumber","Title","NSFOrganization","Program(s)","StartDate","LastAmendmentDate","PrincipalInvestigator","State","Organization","AwardInstrument","ProgramManager","EndDate","AwardedAmountToDate","Co-PIName(s)","PIEmailAddress","OrganizationStreet","OrganizationCity","OrganizationState","OrganizationZip","OrganizationPhone","NSFDirectorate","ProgramElementCode(s)","ProgramReferenceCode(s)","ARRAAmount","Abstract"
+"2424855","eMB: Collaborative Research: Using mathematics to bridge between evolutionary dynamics in the hematopoietic systems of mice and humans: from in vivo to epidemiological scales","DMS","MATHEMATICAL BIOLOGY","10/01/2024","08/08/2024","Angela Fleischman","CA","University of California-Irvine","Standard Grant","Amina Eladdadi","09/30/2027","$116,105.00","","agf@uci.edu","160 ALDRICH HALL","IRVINE","CA","926970001","9498247295","MPS","733400","8038","$0.00","This project is a collaboration between three institutions: University of California-San Diego, Xavier University of Louisiana, and University of California-Irvine. The human blood contains different cell types that are continuously produced, while older cells die. As this process continues while the organism ages, mistakes are made during cell production, generating mutant cells. These mutants can linger in the blood and become more abundant over time. They can contribute to chronic health conditions and there is a chance that they initiate cancer. It is not well understood why these mutant cells persist and expand. One problem that has held back progress is that for obvious reasons it is impossible to perform experiments with human subjects to investigate this. Mathematics combined with epidemiological data, however, offers a way around this limitation. This project develops mathematical models describing the evolution of mutant cells in the blood over time, using experimental mouse data to define the model structure. New mathematical approaches are then used to adapt this model to the human blood system, by bridging between mathematical models of mutant evolution in the blood, and the epidemiological age-incidence of mutants in the human population. There is broad public health impact, since this work can suggest ways to reduce the mutant cells in patients, which can alleviate chronic health conditions and reduce cancer risk. From the educational perspective, the PIs collaborate with Xavier University of Louisiana, an undergraduate historically black university, to foster enthusiasm in continued education and careers in STEM, and equip students with knowledge and skills to potentially continue in graduate programs at top universities, thus promoting social mobility.
As higher organisms age, tissue cells acquire mutations that can rise in frequency over time. Such clonal evolutionary processes have been documented in many human tissues and have become a major focus for understanding the biology of aging. Gaining more insights into mechanisms that drive mutant emergence in non-malignant human tissues is an important biological/public health question that needs to be addressed to define correlates of tissue aging. While experiments in mice have suggested possible drivers of mutant evolution in tissues, a central unresolved question is whether (and how) knowledge from murine models can be applied to humans. Mathematics provides a new approach to address this challenge: We propose a multiscale approach that uses mathematics to bridge between cellular dynamics of mice and humans, by utilizing epidemiological data of mutant incidence in human populations. We use ""clonal hematopoiesis of indeterminate potential"" (CHIP) as a study system, where TET2 and DNMT3A mutant clones emerge in the histologically normal hematopoietic system. Based on stem cell transplantation experiments in mice, we seek to construct a predictive mathematical model of mutant evolution in mice. Using the hazard function, this in vivo model can predict the epidemiological incidence of mutants. Fitting predicted to observed mutant age-incidence data for humans will yield a parameterized and predictive model of human TET2 and DNMT3A mutant evolution. Public health impacts include a better understanding of mutant evolution in the human hematopoietic system, which may lead to evolution-based intervention strategies to reduce CHIP mutant burden.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
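The hazard-function bridge can be prototyped in a few lines: simulate mutant-emergence ages from a stand-in waiting-time model, then estimate the age-specific incidence h(t) = f(t)/(1 - F(t)) that would be compared against epidemiological age-incidence data. All rates below are made up for illustration and are not the project's fitted model.

```python
# Monte Carlo estimate of age-specific incidence (hazard) from a toy
# two-step emergence model: mutation waiting time plus clonal expansion.
import numpy as np

rng = np.random.default_rng(6)
t_emerge = rng.exponential(60.0, 100_000) + rng.exponential(15.0, 100_000)

edges = np.arange(0, 100, 5.0)              # 5-year age bins
events, _ = np.histogram(t_emerge, bins=edges)
at_risk = len(t_emerge) - np.concatenate(([0], np.cumsum(events[:-1])))
hazard = events / at_risk / 5.0             # incidence per person-year
for a, hz in zip(edges[:-1:4], hazard[::4]):
    print(f"age {a:3.0f}-{a+5:3.0f}: incidence {hz:.4f}")
```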
"2424684","eMB: Multi-state bootstrap percolation on digraphs as a framework for neuronal network analysis","DMS","IIBR: Infrastructure Innovatio, MATHEMATICAL BIOLOGY","09/01/2024","08/07/2024","Jonathan Rubin","PA","University of Pittsburgh","Standard Grant","Zhilan Feng","08/31/2027","$384,792.00","Gregory Constantine, Mohammad Amin Rahimian, Sabrina Streipert","jonrubin@pitt.edu","4200 FIFTH AVENUE","PITTSBURGH","PA","152600001","4126247400","MPS","084Y00, 733400","068Z, 8038, 8091","$0.00","To study the spread of disease or opinions, it has proven useful to consider a collection of interacting individuals as a mathematical structure called a graph. A graph consists of a set of nodes and a set of edges, each of which links a pair of nodes. PI will extend this framework to the study of neuronal activity. In the neuronal case, each node is a neuron, and edges between them represent the one-way or directed synaptic connections through which neurons interact. Moreover, to capture the different levels of activity that neurons exhibit, each node will be in one of three states, with state changes determined by interactions along edges. PI will develop novel mathematical methods to analyze the patterns of activity that emerge in such multi-state, directed graphs. These methods will allow the project team to study how activity spreads in networks of neurons, how this spread depends on the connection patterns in the network, and what properties characterize sets of nodes that are especially effective at spreading activity. The research results will generate novel predictions about the properties and function of networks of neurons in the brain, with a focus on the key networks that drive breathing and other essential, rhythmic behaviors. The project will train students, will generate openly available computer code, and will include public outreach through online videos and a summer math program for girls.
Functional outputs from brain circuits driving certain critical, rhythmic behaviors require the widespread emergence of an elevated, bursting state of neuronal activity. The main goal of this project is to advance knowledge about how the localized initiation of activity can rapidly evolve into widespread bursting in synaptically coupled neuronal networks, exemplified by the respiratory brainstem circuit. The project team will achieve this goal by representing such networks as digraphs in which each node can assume one of three possible activity states (inactive, weakly active, and fully active or bursting), with updates based on in-neighbors' states. Little theory exists to characterize this multi-state bootstrap percolation framework, and we will develop new analytical approaches involving mean field and master equations, asymptotic and probabilistic estimates, and graph design based on combinatorial principles. The analysis will address differences between dynamics in this framework and that in bootstrap percolation with only two possible states per node, the impact of global graph properties on activity spread, and the characteristics of local initiation sites that result in especially effective activity propagation. Overall, the project will support a new interdisciplinary collaboration and the completed work, involving trainee mentorship and open sharing of code, will provide important insights and predictions about neuronal dynamics and interactions as well as a range of mathematical advances.
This project is jointly funded by the Mathematical Biology Program in the Division of Mathematical Sciences and the Research Resources Cluster in the Division of Biological Infrastructure in the Directorate for Biological Sciences.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
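A toy version of the multi-state update dynamics described above, on a random digraph with states 0 (inactive), 1 (weakly active), and 2 (bursting). The thresholds and the double weighting of bursting in-neighbors are placeholder assumptions, since the actual update rule is part of what the project will design.

```python
# Three-state bootstrap-percolation sketch on a random digraph.
import numpy as np

rng = np.random.default_rng(7)
n, p = 200, 0.03
A = (rng.random((n, n)) < p).astype(int)    # A[i, j] = 1: edge j -> i
np.fill_diagonal(A, 0)

state = np.zeros(n, dtype=int)
state[rng.choice(n, 5, replace=False)] = 2  # seed a few bursting nodes

for step in range(30):
    # each weakly-active in-neighbor contributes 1, each bursting one 2
    drive = A @ (state >= 1) + A @ (state == 2)
    new = state.copy()
    new[(state == 0) & (drive >= 2)] = 1    # placeholder thresholds
    new[(state == 1) & (drive >= 4)] = 2
    if np.array_equal(new, state):
        break
    state = new
print(f"after {step} steps (inactive, weak, bursting):",
      np.bincount(state, minlength=3))
```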
"2424004","eMB: Explainable and Physics-Informed Machine Learning for Cell Typing via a Modern Optimization Lens","DMS","OFFICE OF MULTIDISCIPLINARY AC, Innovation: Bioinformatics, MATHEMATICAL BIOLOGY","09/01/2024","08/05/2024","Can Li","IN","Purdue University","Standard Grant","Amina Eladdadi","08/31/2027","$376,162.00","Xiaoping Bao","canli@purdue.edu","2550 NORTHWESTERN AVE # 1100","WEST LAFAYETTE","IN","479061332","7654941055","MPS","125300, 164Y00, 733400","068Z, 075Z, 079Z, 8038","$0.00","Human-induced pluripotent stem cells (hiPSCs) represent a groundbreaking advancement in stem cell research. Derived from skin or blood cells, hiPSCs are reprogrammed to an embryonic-like state, enabling them to differentiate into any cell type, such as blood, immune, heart, and neuron cells. This Nobel Prize-winning technology circumvents the ethical issues associated with human embryonic stem cells and provides valuable models for studying human development, disease, drug testing, and potential cell-based therapies. However, to leverage hiPSCs in clinical settings and large-scale manufacturing, there are significant challenges to overcome. One major challenge is accurately identifying cell types at different stages of differentiation, which is crucial for ensuring the cells perform their intended functions. Traditional experimental methods for cell identification can be costly, time-consuming, and limited in robustness. This research aims to address these challenges by developing explainable and physics-informed machine learning models. These models will enhance the accuracy and reliability of cell type identification, ensuring that hiPSC technology can be widely adopted in clinical and industrial applications, ultimately benefiting society through improved healthcare solutions and advancing our understanding of human biology. The project will involve both graduate and undergraduate students, with graduate students focusing on core theory and method development while undergraduates investigate applications. The PIs will work with Purdue?s Research Experience for Undergraduates (REU), and Summer Vertically Integrated Projects (VIP) program to mentor additional underrepresented minority students each summer to work on interdisciplinary research in stem cell engineering and machine learning. Outreach activities will include developing hands-on K-12 activities, partnering with local organizations, organizing lab tours, and presenting research at the ""Mending Broken Hearts"" gallery exhibit, aiming to increase STEM participation among underrepresented groups.
This research project addresses critical challenges in the adoption and scalability of human-induced pluripotent stem cells (hiPSCs) by developing novel machine learning methodologies. The specific problems targeted include the need for high-accuracy, cost-effective cell type identification during differentiation and the incorporation of prior biological knowledge into explainable machine learning models. The PIs intend to create explainable machine learning algorithms that leverage single-cell RNA sequencing (scRNA-seq) and imaging data to provide counterfactual explanations, highlighting key genes or image features critical for cell typing. These models will utilize mixed-integer programming to compute counterfactual explanations and generate interpretable predictions, addressing the limitations of current black-box approaches. Additionally, the aim is to overcome data scarcity by integrating biological knowledge into the machine learning frameworks, employing novel physics-informed machine learning algorithms. This research will develop and benchmark these innovative methods, applying them to the study of Tumor Associated Neutrophils (TANs) for cancer therapy. By enhancing explainability in cell typing predictions, this work will significantly advance the field of stem cell research and its applications in regenerative medicine and oncology.
This project is jointly funded by the Mathematical Biology Program in the Division of Mathematical Sciences, the Infrastructure Innovation for Biological Research program in the Division of Biological Infrastructure (BIO/DBI), and the Office of Strategic Initiatives.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
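The counterfactual-explanation computation has a simple linear special case: for a linear classifier and an L1 cost, the smallest decision-flipping change is a linear program with no integer variables, which the abstract's mixed-integer formulation generalizes. The weights, data point, and margin below are illustrative assumptions.

```python
# Counterfactual explanation for a linear classifier via linear
# programming: the smallest L1 change to x that flips w.x + b >= 0.
import numpy as np
from scipy.optimize import linprog

w = np.array([1.5, -2.0, 0.5])              # illustrative model weights
b = -1.0
x = np.array([0.2, 0.6, 0.1])               # currently classified negative
margin = 0.1                                # require w.x' + b >= margin

d = len(x)                                  # variables [p; q], change = p - q
c = np.ones(2 * d)                          # minimize sum(p)+sum(q) = ||change||_1
A_ub = np.concatenate([-w, w])[None, :]     # -(w.p - w.q) <= w.x + b - margin
b_ub = [w @ x + b - margin]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2 * d)
change = res.x[:d] - res.x[d:]
print("counterfactual x':", x + change, " new score:", w @ (x + change) + b)
```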
+"2424854","eMB: Collaborative Research: Using mathematics to bridge between evolutionary dynamics in the hematopoietic systems of mice and humans: from in vivo to epidemiological scales","DMS","MATHEMATICAL BIOLOGY","10/01/2024","08/08/2024","Timmy Ma","LA","Xavier University of Louisiana","Standard Grant","Amina Eladdadi","09/30/2027","$36,644.00","","tma@xula.edu","1 DREXEL DR","NEW ORLEANS","LA","701251056","5045205440","MPS","733400","8038, 9150","$0.00","This project is a collaboration between three institutions: University of California-San Diego, Xavier University of Louisiana, and University of California-Irvine. The human blood contains different cell types that are continuously produced, while older cells die. As this process continues while the organism ages, mistakes are made during cell production, generating mutant cells. These mutants can linger in the blood and become more abundant over time. They can contribute to chronic health conditions and there is a chance that they initiate cancer. It is not well understood why these mutant cells persist and expand. One problem that has held back progress is that for obvious reasons it is impossible to perform experiments with human subjects to investigate this. Mathematics combined with epidemiological data, however, offers a way around this limitation. This project develops mathematical models describing the evolution of mutant cells in the blood over time, using experimental mouse data to define the model structure. New mathematical approaches are then used to adapt this model to the human blood system, by bridging between mathematical models of mutant evolution in the blood, and the epidemiological age-incidence of mutants in the human population. There is broad public health impact, since this work can suggest ways to reduce the mutant cells in patients, which can alleviate chronic health conditions and reduce cancer risk. From the educational perspective, the PIs collaborate with Xavier University of Louisiana, an undergraduate historically black university, to foster enthusiasm in continued education and careers in STEM, and equip students with knowledge and skills to potentially continue in graduate programs at top universities, thus promoting social mobility.
As higher organisms age, tissue cells acquire mutations that can rise in frequency over time. Such clonal evolutionary processes have been documented in many human tissues and have become a major focus for understanding the biology of aging. Gaining more insights into mechanisms that drive mutant emergence in non-malignant human tissues is an important biological/public health question that needs to be addressed to define correlates of tissue aging. While experiments in mice have suggested possible drivers of mutant evolution in tissues, a central unresolved question is whether (and how) knowledge from murine models can be applied to humans. Mathematics provides a new approach to address this challenge: We propose a multiscale approach that uses mathematics to bridge between cellular dynamics of mice and humans, by utilizing epidemiological data of mutant incidence in human populations. We use ""clonal hematopoiesis of indeterminate potential"" (CHIP) as a study system, where TET2 and DNMT3A mutant clones emerge in the histologically normal hematopoietic system. Based on stem cell transplantation experiments in mice, we seek to construct a predictive mathematical model of mutant evolution in mice. Using the hazard function, this in vivo model can predict the epidemiological incidence of mutants. Fitting predicted to observed mutant age-incidence data for humans will yield a parameterized and predictive model of human TET2 and DNMT3A mutant evolution. Public health impacts include a better understanding of mutant evolution in the human hematopoietic system, which may lead to evolution-based intervention strategies to reduce CHIP mutant burden.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2424853","eMB: Collaborative Research: Using mathematics to bridge between evolutionary dynamics in the hematopoietic systems of mice and humans: from in vivo to epidemiological scales","DMS","MATHEMATICAL BIOLOGY","10/01/2024","08/08/2024","Dominik Wodarz","CA","University of California-San Diego","Standard Grant","Amina Eladdadi","09/30/2027","$200,812.00","Natalia Komarova","dwodarz@ucsd.edu","9500 GILMAN DR","LA JOLLA","CA","920930021","8585344896","MPS","733400","8038","$0.00","This project is a collaboration between three institutions: University of California-San Diego, Xavier University of Louisiana, and University of California-Irvine. The human blood contains different cell types that are continuously produced, while older cells die. As this process continues while the organism ages, mistakes are made during cell production, generating mutant cells. These mutants can linger in the blood and become more abundant over time. They can contribute to chronic health conditions and there is a chance that they initiate cancer. It is not well understood why these mutant cells persist and expand. One problem that has held back progress is that for obvious reasons it is impossible to perform experiments with human subjects to investigate this. Mathematics combined with epidemiological data, however, offers a way around this limitation. This project develops mathematical models describing the evolution of mutant cells in the blood over time, using experimental mouse data to define the model structure. New mathematical approaches are then used to adapt this model to the human blood system, by bridging between mathematical models of mutant evolution in the blood, and the epidemiological age-incidence of mutants in the human population. There is broad public health impact, since this work can suggest ways to reduce the mutant cells in patients, which can alleviate chronic health conditions and reduce cancer risk. From the educational perspective, the PIs collaborate with Xavier University of Louisiana, an undergraduate historically black university, to foster enthusiasm in continued education and careers in STEM, and equip students with knowledge and skills to potentially continue in graduate programs at top universities, thus promoting social mobility.
As higher organisms age, tissue cells acquire mutations that can rise in frequency over time. Such clonal evolutionary processes have been documented in many human tissues and have become a major focus for understanding the biology of aging. Gaining more insights into mechanisms that drive mutant emergence in non-malignant human tissues is an important biological / public health question that needs to be addressed to define correlates of tissue aging. While experiments in mice have suggested possible drivers of mutant evolution in tissues, a central unresolved question is whether (and how) knowledge from murine models can be applied to humans. Mathematics provides a new approach to address this challenge: We propose a multiscale approach that uses mathematics to bridge between cellular dynamics of mice and humans, by utilizing epidemiological data of mutant incidence in human populations. We use ""clonal hematopoiesis of indeterminate potential"" (CHIP) as a study system, where TET2 and DNMT3A mutant clones emerge in the histologically normal hematopoietic system. Based on stem cell transplantation experiments in mice, we seek to construct a predictive mathematical model of mutant evolution in mice. Using the hazard function, this in vivo model can predict the epidemiological incidence of mutants. Fitting predicted to observed mutant age-incidence data for humans will yield a parameterized and predictive model of human TET2 and DNMT3A mutant evolution. Public health impacts include a better understanding of mutant evolution in the human hematopoietic system, which may lead to evolution-based intervention strategies to reduce CHIP mutant burden.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2424748","eMB: Multi-task biologically informed neural networks for learning dynamical systems from single-cell protein expression data","DMS","MATHEMATICAL BIOLOGY","09/01/2024","08/08/2024","Kevin Flores","NC","North Carolina State University","Standard Grant","Amina Eladdadi","08/31/2027","$317,533.00","Orlando Arguello-Miranda","kbflores@ncsu.edu","2601 WOLF VILLAGE WAY","RALEIGH","NC","276950001","9195152444","MPS","733400","075Z, 079Z, 8038","$0.00","A fundamental unanswered biological question with biomedical and industrial relevance is how cells integrate external information from multiple biochemical pathways to enter and exit quiescence in response to stress. Non-destructive measurements of single-cell protein expression in yeast cells have revealed that signaling in quiescence-related biochemical pathways can be highly heterogeneous across genetically identical cell populations as they transition into dormancy, even though they are treated with the exact same stress type and timing. Mathematical modeling has produced remarkable insights into the protein interaction networks that govern the yeast cell cycle and quiescence, however, predicting the transition from proliferation into quiescence at the single-cell level remains unclear under most physiological scenarios. The goal of this project is to develop a mathematical model that recapitulates the heterogeneity in proliferation/quiescence transitions of yeast cells in response to multiple quiescence-promoting stimuli. To accomplish this goal, this project will couple the development of novel scientific machine learning methods, chemical and environmental perturbation experiments, and single-cell protein expression measurements in live yeast cells. This research will address the mathematical challenges of learning and validating mathematical models from heterogeneous high-dimensional time series data to answer significant questions about how signaling pathways govern cell fate and differentiation. The project?s findings will be applicable to quiescence-related phenomena such as chemotherapy-resistant quiescent cancer cells, stem cells that exit quiescence for wound healing, and developmental processes that rely on the ubiquitous stress signaling pathways that will be studied. Research findings will be communicated to the scientific community through conference workshops and minisymposia, and to the general public through the creation of new K-12 outreach exhibits.
The proposed work will develop a data-driven mathematical framework to mechanistically explain inter-cellular variability during proliferation-quiescence transitions. Specifically, new deep learning tools will be developed to directly learn differential equation models from multivariate protein expression data collected from individual yeast cells undergoing quiescence in response to a diverse range of biologically relevant stressors. These research efforts will involve the integration of recurrent neural networks, multi-task learning, and novel regularization methods that enable deep learning models to simultaneously learn differential equations from thousands of single-cell replicates of protein expression time series data. Sensitivity analysis methods will be developed in conjunction with these new deep learning tools to enable optimization within a vast space of stress combinations and timing, thereby generating quantitative predictions about which experimental perturbations have the greatest effect on inter-cellular phenotype variability. The application of the new framework to non-destructive single-cell data arising from state-of-the-art experimental setups will shed new light on how coordinated cell division, stress, and metabolic signaling pathways produce intercellular variability in protein expression and quiescence phenotypes observed across species. In addition, this project will provide interdisciplinary training to graduate and undergraduate students, and develop open-source code for application to biological data sets involving perturbation experiments with multivariate time series data.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
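As a much simpler stand-in for the multi-task networks this record proposes, the following sketch shows the basic shape of learning a differential equation from many single-cell time series: simulate noisy trajectories, estimate derivatives, and regress them on a function library. This is a SINDy-style regression step, not the project's method, and the kinetics and all constants are invented.

```python
# Minimal sketch of learning an ODE from many noisy "single-cell" time series
# via regression on finite-difference derivatives. Assumed toy kinetics:
# dx/dt = a*x + b, with cell-to-cell noise.
import numpy as np

rng = np.random.default_rng(0)
true_a, true_b = -0.5, 1.2
dt, T, n_cells = 0.1, 60, 200

# Simulate trajectories (forward Euler plus small per-cell noise)
X = np.empty((n_cells, T))
X[:, 0] = rng.uniform(0.1, 3.0, n_cells)
for t in range(T - 1):
    X[:, t + 1] = X[:, t] + dt * (true_a * X[:, t] + true_b) \
                  + 0.02 * rng.normal(size=n_cells)

# Finite-difference derivatives, then least squares on the library [x, 1]
dXdt = (X[:, 1:] - X[:, :-1]) / dt
xs = X[:, :-1].ravel()
library = np.column_stack([xs, np.ones_like(xs)])
coef, *_ = np.linalg.lstsq(library, dXdt.ravel(), rcond=None)
print(f"recovered a={coef[0]:.3f}, b={coef[1]:.3f} (true {true_a}, {true_b})")
```

The project's setting replaces this linear regression with recurrent networks and regularization that share information across thousands of heterogeneous cells.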
"2436120","MPOPHC: Integrating human risk perception and social processes into policy responses in an epidemiological model","DMS","OFFICE OF MULTIDISCIPLINARY AC, MATHEMATICAL BIOLOGY, ","10/01/2024","08/07/2024","Brian Beckage","VT","University of Vermont & State Agricultural College","Standard Grant","Zhilan Feng","09/30/2027","$1,344,200.00","Suzanne Lenhart, Charles Sims, Katherine Lacasse","Brian.Beckage@uvm.edu","85 S PROSPECT STREET","BURLINGTON","VT","054051704","8026563660","MPS","125300, 733400, Y20600","008Z, 9150, 9178, 9179","$0.00","Epidemics arise from interactions between pathogens and human hosts, where the pathogen influences human behavior and human behavior influences the spread of the pathogen. The models used to predict pathogen spread do not include the complexity of interactions between disease and human behavior but instead focus on biological processes and policy interventions. However, disease transmission depends on people?s behaviors, which are shaped by their perceptions of risk from the disease and from health interventions, as well as by the opinions and behaviors of the other people around them. This project will contribute to the development of mathematical epidemiological models that better represent the complexities of the human response to disease and that can be used to evaluate the relative impacts of public health policies on disease dynamics. The project will be focused on understanding respiratory diseases such as COVID-19, seasonal flu, and bird flu, but can be readily modified to be broadly applicable to other infectious diseases such as HIV or Ebola. The project will contribute to existing national COVID-19 and Flu Scenario Modeling Hubs that are working to better predict and understand the dynamics of infectious disease and to contribute to policy interventions. The Investigators will disseminate the results and foster connections with the disease modeling community through a workshop for public health professionals and will engage the public through production of educational music videos targeted at the broader community
The complexity of human behavior is not well represented in epidemiological models, contributing to reduced skill and utility of model forecasts. While some epidemiological models represent human behavioral responses using a few static parameters, the Investigators will construct models of human behavior and policy processes that update dynamically to represent the dependence of human responses on the evolving state of the epidemic. Human cognitive, social, and policy responses will be represented using a system of differential equations linked with a traditional Susceptible-Exposed-Infected-Recovered epidemiological model, using infectious respiratory pathogens such as SARS-CoV-2 and H5N1 as model systems. Adoption of protective behaviors (vaccination, physical distancing) will be a function of risk perceptions (from disease and health interventions), health policies (lockdowns, vaccine mandates), and the behavior of other people (social norms). Policy interventions and adoption of protective behaviors mediate disease spread and impacts (infections and deaths) that influence human behavioral and policy responses. Mathematical novelty arises because cognition depends upon the history of infection, so the differential equations have past-dependence, generating integro-differential equations. Model outputs will be used to analyze the sensitivity of and uncertainty in epidemic forecasts that arise from human risk perceptions, social influence, protective behaviors, and policy interventions. This project will advance the disease modeling community's capability to analyze the interlinked dynamics of human social systems and infectious disease, increase the impact of social science on the disease modeling community, and develop analysis methods for the complex and time-dependent interactions that arise from linkages of disease dynamics with social systems.
This award is co-funded by the NSF Division of Mathematical Sciences (DMS) and the CDC Coronavirus and Other Respiratory Viruses Division (CORVD).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
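The memory-dependent coupling this record describes can be illustrated with a minimal sketch: an SEIR model whose transmission rate is suppressed by perceived risk, taken here to be an exponentially weighted memory of past infections. With an exponential kernel the integro-differential system closes into ordinary differential equations via one extra state variable. All functional forms and parameters below are assumptions for illustration, not the project's model.

```python
# SEIR with behavioral feedback: transmission beta is reduced by a risk-memory
# state M, where M(t) is the exponentially weighted integral of past infection
# levels (so dM/dt = I - decay*M). All rates are assumed, per day.
import numpy as np

beta0, sigma, gamma = 0.4, 1 / 3, 1 / 7   # base transmission, incubation, recovery
k_risk, decay = 40.0, 0.05                # risk sensitivity and memory decay (assumed)
S, E, I, R, M = 0.999, 0.0, 0.001, 0.0, 0.0
dt = 0.1
for _ in range(int(300 / dt)):
    beta = beta0 / (1.0 + k_risk * M)     # perceived risk suppresses transmission
    dS = -beta * S * I
    dE = beta * S * I - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    dM = I - decay * M                    # exponential memory of infection history
    S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR; M += dt * dM
print(f"final attack rate with risk feedback: {R:.3f}")
```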
"2432168","Conference: Interagency Analysis and Modeling Group/Multiscale Modeling Consortium (IMAG/MSM) Meeting on Operationalizing the NASEM Report on Digital Twins","DMS","STATISTICS, COMPUTATIONAL MATHEMATICS, MATHEMATICAL BIOLOGY","08/15/2024","08/02/2024","Gary An","VT","University of Vermont & State Agricultural College","Standard Grant","Troy D. Butler","07/31/2025","$30,000.00","","gan@med.med.edu","85 S PROSPECT STREET","BURLINGTON","VT","054051704","8026563660","MPS","126900, 127100, 733400","075Z, 079Z, 7556, 9150","$0.00","On December 15, 2023, The National Academies of Sciences, Engineering and Medicine (NASEM) released a report entitled: ?Foundational Research Gaps and Future Directions for Digital Twins? (?NASEM DT REPORT?). The purpose of this report was to bring structure to the burgeoning field of digital twins by providing a working definition and a series of research challenges that need to be addressed to allow this technology to fulfill its full potential. The concept of digital twins is compelling and has the potential to impact a broad range of domains. For instance, digital twins have either been proposed or are currently being developed for manufactured devices, buildings, cities, ecologies and the Earth as a whole. It is natural that the concept be applied to biology and medicine, as the most recognizable concept of a ?twin? is that of identical human twins. The application of digital twins to biomedicine also follows existing trends of Personalized and Precision medicine, in short: ?the right treatment for the right person at the right time.? Fulfilling the promise of biomedical digital twins will require multidisciplinary Team Science that brings together various experts from fields as diverse as medicine, computer science, engineering, biological research, advanced mathematics and ethics. The purpose of this conference, the ?2024 Interagency Modeling and Analysis Group (IMAG)/Multiscale Modeling (MSM) Consortium Meeting: Setting up Teams for Biomedical Digital Twins,? is to do exactly this: bringing together such needed expertise in a series of teaming exercises to operationalize the findings of the NASEM DT REPORT in the context of biomedical digital twins. As part of outreach and training efforts to broaden the participation within this growing field, this workshop will provide support for both traditionally under-represented categories of senior researchers as well as junior researchers such as graduate students and postdoctoral researchers.
Facilitating the development and deployment of biomedical digital twins requires operationalizing the findings and recommendations of the NASEM DT REPORT, which raises a series of specific and unique challenges in the biomedical domain. More specifically, there are numerous steps that need to be taken to convert the highly complex simulation models of biological processes developed by members of the MSM Consortium into biomedical digital twins that are compliant with the definition of digital twins presented in the NASEM DT REPORT. There are also identified challenges associated with these various steps. Some of these challenges can benefit from lessons learned in other domains that have developed digital twins while others will require the development of new techniques in the fields of statistics, computational mathematics and mathematical biology. This task will require multidisciplinary collaborations between mathematicians, computational researchers, experimental biologists and clinicians. This IMAG/MSM meeting will promote the concepts of Team Science to bring together experienced multiscale modeling researchers and experts from the mathematical, statistical, computational, experimental and clinical communities to form the multidisciplinary teams needed to operationalize the findings of the NASEM DT REPORT. The website for this meeting is at https://www.imagwiki.nibib.nih.gov/news-events/announcements/2024-imagmsm-meeting-september-30-october-2-2024, with the landing page for the Interagency Modeling and Analysis Group at https://www.imagwiki.nibib.nih.gov/.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2425995","Conference: The SIAM Quantum Intersections Convening","DMS","FET-Fndtns of Emerging Tech, OFFICE OF MULTIDISCIPLINARY AC, INFRASTRUCTURE PROGRAM, APPLIED MATHEMATICS, TOPOLOGY, FOUNDATIONS, STATISTICS, QIS - Quantum Information Scie, MATHEMATICAL BIOLOGY","08/01/2024","07/31/2024","Suzanne Weekes","PA","Society For Industrial and Applied Math (SIAM)","Standard Grant","Tomek Bartoszynski","07/31/2025","$349,996.00","","weekes@siam.org","3600 MARKET ST FL 6","PHILADELPHIA","PA","191042669","2153829800","MPS","089Y00, 125300, 126000, 126600, 126700, 126800, 126900, 728100, 733400","7203, 7556","$0.00","Society for Industrial and Applied Mathematics (SIAM) will host the SIAM Quantum Intersections Convening - Integrating Mathematical Scientists into Quantum Research to bring quantum-curious mathematical scientists together with leading experts in quantum science for a three-day interactive workshop. Recognizing the critical role of mathematical scientists, this convening aims to promote multidisciplinary collaborations that bridge the gap between mathematics and quantum sciences and aims to foster and increase the involvement and visibility of mathematicians and statisticians in quantum science research and education. The convening will be organized by a steering committee and will be supported by professional facilitators. Participants will learn from and connect with physicists, computer scientists, engineers and mathematical scientists who are experts in quantum science. This in-person gathering will be held in fall 2024 in the Washington DC area. A primary deliverable from the convening will be a report summarizing the activities and recommendations generated during the event. Key presentations will be recorded and will be available on a SIAM webpage.
Society for Industrial and Applied Mathematics (SIAM) will host this convening with the goals of (i) making more mathematical scientists aware of the demand for their expertise in quantum research and articulating areas and problems where they can contribute, (ii) increasing the participation of researchers in the mathematical sciences in the quantum information science revolution to accelerate its research and development, (iii) providing a seeding ground for partnerships and collaborations of mathematical scientists with physicists, computer scientists, and engineers from industry and academia, and (iv) recommending activities to develop a quantum science and technology workforce pipeline in the mathematical and computational sciences. A few topics in quantum science where mathematics can help research and discovery include quantum computing, quantum algorithms, quantum optimization, quantum error correction, quantum information theory, quantum cryptography, quantum sensing and metrology, and quantum networks.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2419308","Conference: Ninth International Conference on Mathematical Modeling and Analysis of Populations in Biological Systems","DMS","MATHEMATICAL BIOLOGY","08/01/2024","07/31/2024","Peter Hinow","WI","University of Wisconsin-Milwaukee","Standard Grant","Amina Eladdadi","07/31/2025","$20,000.00","Gabriella Pinter, Istvan Lauko","hinow@uwm.edu","3203 N DOWNER AVE # 273","MILWAUKEE","WI","532113188","4142294853","MPS","733400","7556","$0.00","The Ninth International Conference on Mathematical Modeling and Analysis of Populations in Biological Systems (ICMA-IX) will be held at the University of Wisconsin - Milwaukee, on October 18 - 19, 2024. ICMA-IX builds upon the success of eight previous conferences, each of which had over 100 participants. The aim of this conference is to have participants from a wide variety of backgrounds, various career levels, and from underrepresented groups. The conference will highlight significant recent developments in mathematical biology and provide a forum for the participants to meet, to communicate their scientific discoveries, and to initiate new collaborations. By bringing together a new generation of researchers along with the established experts, the aim is to cultivate new collaborations and networks that can help junior researchers as they advance their careers.
Mathematical modeling in biology and medicine is becoming ever more important and impactful. This became more visible than ever to the general public during the years of the recent COVID-19 pandemic, when crucial policy recommendations were made by epidemiologists and modelers based on mathematical models of the dynamics of the disease. Further applications include a better understanding of cancer spread and its treatment, the workings of the nervous system, and interactions of species in ecosystems at various scales. ICMA-IX will continue the tradition of including a plenary talk based on the paper that won the Lord May Prize awarded by the Journal of Biological Dynamics. The conference will feature mathematical modeling in biology diverse in both its applications and its techniques. By its very nature, this research is interdisciplinary and collaborative. Thus, conferences are crucial events to bring together researchers and students from different areas.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2425995","Conference: The SIAM Quantum Intersections Convening","DMS","FET-Fndtns of Emerging Tech, OFFICE OF MULTIDISCIPLINARY AC, INFRASTRUCTURE PROGRAM, APPLIED MATHEMATICS, TOPOLOGY, FOUNDATIONS, STATISTICS, QIS - Quantum Information Scie, MATHEMATICAL BIOLOGY","08/01/2024","07/31/2024","Suzanne Weekes","PA","Society For Industrial and Applied Math (SIAM)","Standard Grant","Tomek Bartoszynski","07/31/2025","$349,996.00","","weekes@siam.org","3600 MARKET ST FL 6","PHILADELPHIA","PA","191042669","2153829800","MPS","089Y00, 125300, 126000, 126600, 126700, 126800, 126900, 728100, 733400","7203, 7556","$0.00","Society for Industrial and Applied Mathematics (SIAM) will host the SIAM Quantum Intersections Convening - Integrating Mathematical Scientists into Quantum Research to bring quantum-curious mathematical scientists together with leading experts in quantum science for a three-day interactive workshop. Recognizing the critical role of mathematical scientists, this convening aims to promote multidisciplinary collaborations that bridge the gap between mathematics and quantum sciences and aims to foster and increase the involvement and visibility of mathematicians and statisticians in quantum science research and education. The convening will be organized by a steering committee and will be supported by professional facilitators. Participants will learn from and connect with physicists, computer scientists, engineers and mathematical scientists who are experts in quantum science. This in-person gathering will be held in fall 2024 in the Washington DC area. A primary deliverable from the convening will be a report summarizing the activities and recommendations generated during the event. Key presentations will be recorded and will be available on a SIAM webpage.
Society for Industrial and Applied Mathematics (SIAM) will host this convening with the goals of (i) making more mathematical scientists aware of the demand for their expertise in quantum research and articulating areas and problems where they can contribute, (ii) increasing the participation of researchers in the mathematical sciences in the quantum information science revolution to accelerate its research and development, (iii) providing a seeding ground for partnerships and collaborations of mathematical scientists with physicists, computer scientists, and engineers from industry and academia, and (iv) recommending activities to develop a quantum science and technology workforce pipeline in the mathematical and computational sciences. A few topics in quantum science where mathematics can help research and discovery include quantum computing, quantum algorithms, quantum optimization, quantum error correction, quantum information theory, quantum cryptography, quantum sensing and metrology, and quantum networks.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2350184","Dynamical Systems with a View towards Applications","DMS","ANALYSIS PROGRAM, MATHEMATICAL BIOLOGY","07/01/2024","04/10/2024","Lai-Sang Young","NY","New York University","Continuing Grant","Marian Bocea","06/30/2029","$375,506.00","","lsy@cims.nyu.edu","70 WASHINGTON SQ S","NEW YORK","NY","100121019","2129982121","MPS","128100, 733400","5916, 5918, 5936, 5946, 7406","$0.00","The project will broaden the reach of the existing mathematical theory of dynamical systems, and will contribute to bridging the gap between theory and application. The theory of dynamical systems lies at the crossroads of several areas of mathematics, and has natural applications to engineering and to other scientific disciplines. In this project, the principal investigator will extend relevant dynamical results from the finite dimensional case to the infinite dimensional case. These include results about dynamical semi-flows generated by evolutionary partial differential equations. Such equations model a variety of physical phenomena. A second component of the project consists in leveraging the principal investigator?s expertise in interdisciplinary research to identify recurrent themes and emergent phenomena arising naturally in the biological sciences, thereby incorporating new phenomenology into a modern theory of dynamical systems. In addition to these scientific advances, the proposed projects offer ample training opportunities for students and postdocs.
The project centers on four lines of research. The first two lines seek to extend finite-dimensional phenomena to infinite dimensions. In the first project, the principal investigator aims to show that in the presence of random forces, a unifying description of large-time orbit distribution holds much more generally than is currently known. The second project seeks to extract low dimensional structures and dynamical phenomena embedded in high dimensions. Specifically, the PI will aim to show that shear-induced chaos is a source of instability in physical models including the Navier-Stokes system. The remaining two projects investigate a class of reaction networks of relevance to biology. Mean-field approaches to the large-time behavior of scalable networks will be investigated. The project also aims to study the novel concept of 'depletion', a bifurcation phenomenon amenable to mathematical analysis. From the viewpoint of applications, depletion occurs naturally in several contexts and potentially has dire biological consequences. The final project seeks to use scalable reaction networks as a model to answer a question of fundamental importance for dissipative dynamical systems, namely, which invariant measures are visible from an observational viewpoint?
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
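One theme of this record, the large-time statistics of randomly forced systems, can be illustrated in one dimension: estimate the Lyapunov exponent of a randomly kicked circle map by averaging log-derivatives along an orbit. This is a generic toy, not the PI's infinite-dimensional setting; the map and kick strength are invented.

```python
# Toy estimate of the top Lyapunov exponent of a randomly kicked circle map.
# A positive average of log|f'(x)| along the orbit signals chaotic behavior
# of typical trajectories under the random forcing.
import numpy as np

rng = np.random.default_rng(1)

def step(x, kick):                   # assumed map: circle map plus random kick
    return (x + 0.5 * np.sin(2 * np.pi * x) + kick) % 1.0

def deriv(x):                        # derivative of the deterministic part
    return 1.0 + np.pi * np.cos(2 * np.pi * x)

x, total, N = 0.2, 0.0, 200_000
for _ in range(N):
    total += np.log(max(abs(deriv(x)), 1e-12))
    x = step(x, kick=0.3 * rng.normal())
print(f"estimated Lyapunov exponent: {total / N:+.3f}")
```

In shear-induced chaos it is precisely the interplay of shear with forcing that pushes such an exponent above zero.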
"2413769","Connecting ecosystem models with evolutionary genetics: Fitness landscapes, persistent networks, and phylodynamic simulation","DMS","MATHEMATICAL BIOLOGY","08/01/2024","07/31/2024","Cameron Browne","LA","University of Louisiana at Lafayette","Standard Grant","Zhilan Feng","07/31/2027","$261,226.00","","cxb0559@louisiana.edu","104 E UNIVERSITY AVE","LAFAYETTE","LA","705032014","3374825811","MPS","733400","9150","$0.00","The evolution of ecological networks depends on the underlying population dynamics, genetics, and interactions of the composite species. In the single species context, theoretical models have simplified the study of adaptation to genotype/phenotype fitness, however complex eco-evolutionary dynamics in multi-species communities have challenged researchers. For example, in microbial and virus-immune response ecosystems, a dynamic fitness landscape determines whether pathogen strains can gain several resistance mutations. This project aims to bridge ecosystem dynamics with evolutionary genetics by analyzing ecological models evolving on fitness landscapes and developing a computational platform for incorporating phylogenetics. The advances can inform vaccines and therapies which prevent pathogen resistance against multiple immune responses or drugs. This research will engage undergraduate and graduate students, providing interdisciplinary training in computational and mathematical biology.
Recent work has sought to understand assembly of interacting species in ecosystem models. However, the overwhelming number of species combinations, and the difficulty of connecting such models to evolution, have challenged researchers, especially in higher-dimensional systems. This project aims to address this gap by looking through an evolutionary genetics lens, representing species variants as binary sequences which encode ecological interactions and fitness landscapes. First, persistence and stability of equilibria in models of ecological networks will be analyzed and linked to epistasis (non-additivity) in underlying fitness landscapes, facilitating simplifying rules for ecosystem evolution. Predator-prey and consumer-resource networks pertaining to viral evolution, microbiomes and antibiotic resistance, along with applications to immunotherapy and treatments, will be considered. Next, the PI will construct a flexible computational method for jointly simulating eco-evolutionary trajectories and phylogenetic trees, which can validate the theoretical results and confront both genetic and population dynamic data. The work will involve interdisciplinary collaboration on how fitness landscapes shape ecological network evolution for HIV-immune dynamics, and microbial resistance to antibiotics and phage infection. Through dynamical systems, stochastic simulation, and combinatorial and computational analysis, techniques will be developed for connecting population dynamics and evolutionary genetics.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
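The genotype-to-ecology bookkeeping proposed in this record can be sketched directly: species as binary genotypes, a random epistatic landscape assigning growth rates, competition decaying with Hamming distance, and Lotka-Volterra dynamics selecting the persisting community. All functional forms and constants below are invented for illustration and do not reproduce the project's models.

```python
# Binary genotypes on an epistatic fitness landscape feeding a Lotka-Volterra
# competition model; which genotypes persist depends on both the landscape
# and the (assumed) Hamming-distance competition kernel.
import numpy as np

rng = np.random.default_rng(2)
L = 3                                           # loci, so 2**L genotypes
genotypes = np.array([[(g >> i) & 1 for i in range(L)] for g in range(2 ** L)])

a = rng.normal(0, 0.3, L)                       # additive locus effects
J = rng.normal(0, 0.3, (L, L))                  # pairwise epistasis
fitness = 1.0 + genotypes @ a + np.einsum('gi,ij,gj->g', genotypes, J, genotypes)

ham = (genotypes[:, None, :] != genotypes[None, :, :]).sum(-1)
alpha = np.exp(-ham)                            # competition decays with distance

n = np.full(2 ** L, 0.1)
for _ in range(20_000):                         # Euler steps of LV competition
    n += 0.001 * n * (fitness - alpha @ n)
    n = np.clip(n, 0.0, None)

survivors = [format(g, f'0{L}b') for g in np.where(n > 1e-3)[0]]
print("persisting genotypes:", survivors)
```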
"2338630","CAREER: Multiscale Model for Cell Morphogenesis and Tissue Development in Plant Leaves","DMS","PLANT FUNGAL & MICROB DEV MECH, MATHEMATICAL BIOLOGY","07/01/2024","02/23/2024","Weitao Chen","CA","University of California-Riverside","Continuing Grant","Amina Eladdadi","06/30/2029","$185,464.00","","weitaoc@ucr.edu","200 UNIVERSTY OFC BUILDING","RIVERSIDE","CA","925210001","9518275535","MPS","111800, 733400","068Z, 1045, 8038","$0.00","This study will improve our understanding about general mechanisms involved in cell morphogenesis. Epidermal cells in plant leaves, especially pavement cells (PCs), exhibit interdigitated puzzle shapes. Shape formation of PCs serves as an ideal model to understand principles governing cell morphogenesis and tissue growth in developmental biology. During the development, PCs change from small rounded polygonal shapes into large interdigitated puzzle shapes, under the regulation of the extracellular plant hormone and multiple key molecules accumulated at the plasma membrane (PM). This developmental process also involves reorganization of cytoskeletal components and dynamic cell-to-cell communication. The multiscale mathematical model proposed in this research will have a broad range of applications from tissue engineering to biomanufacturing and biotechnology. The PI will develop a satellite undergraduate research program based on University of California, Riverside (UCR), a Hispanic-serving institution with many first generation college students. The program will also recruit students from California State Universities and high schools in local public school district. Research symposia and summer research programs will be organized through the coordination with department of mathematics, Association of Women in Mathematics at UCR Chapter, Interdisciplinary Center for Quantitative Modeling in Biology at UCR, and nearby colleges to promote public awareness on research and career paths in mathematical biology.
The morphogenesis of PCs is a complex process. During the early stage, the extracellular hormone induces nanoclustering on the PM of individual cells to initiate cell polarization. This is followed by nonhomogeneous mechanical forces exerted along the cell wall and a subcellular signaling gradient that regulate cell-cell interaction. Stable cell polarization, and ultimately the interdigitated jigsaw-puzzle cell shapes, will be established through regulatory interactions between the nanoclustering signals and structural components within the same cell and through regulation of Rho GTPase signals between neighboring cells. Different hypotheses have been proposed for the mechanism underlying PC morphogenesis, but most focus on either chemical signals or mechanical properties of PCs. The proposed study will test new hypotheses that include both the chemical signaling network and mechanical properties, as well as interactions between them, and will therefore provide novel insights into the fundamental principles of cell shape formation and tissue development. This project will develop a multiscale model using the local level set method to incorporate both chemical signals and mechanical properties in a multicellular environment to test different hypotheses on the shape formation of PCs. Machine learning techniques and experimental data will be used in model selection, parameter estimation, and model calibration.
This CAREER project is jointly funded by the Mathematical Biology Program at the Division of Mathematical Science and the Developmental Systems Cluster at the Division of Integrative Organismal Systems in the Directorate for Biological Sciences.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
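A heavily reduced caricature of the chemical side of such a model is the classic wave-pinning mechanism for cell polarization: active and inactive GTPase forms with slow and fast diffusion, where a transient stimulus can pin a stable polar pattern. The sketch below is a standard 1D reaction-diffusion scheme with assumed (Mori-type) parameters, not the proposed multicellular level-set model, and may or may not pin cleanly for other parameter choices.

```python
# 1D wave-pinning caricature of cell polarization: active form u (slow
# diffusion) and inactive form v (fast diffusion) exchange through a
# positive-feedback activation term; a transient stimulus at one end can
# leave a persistent front/back asymmetry. Parameters are assumed.
import numpy as np

N, dx, dt, steps = 100, 0.1, 0.001, 40_000
Da, Di = 0.01, 1.0                      # assumed diffusivities, active << inactive
k0, gam, K, delta = 0.067, 1.0, 1.0, 1.0
u = 0.2 * np.ones(N)
v = 2.0 * np.ones(N)
u[:10] += 2.0                           # transient stimulus at the "front" end

def lap(f):                             # 1D Laplacian with no-flux boundaries
    g = np.empty_like(f)
    g[1:-1] = (f[2:] - 2 * f[1:-1] + f[:-2]) / dx**2
    g[0] = (f[1] - f[0]) / dx**2
    g[-1] = (f[-2] - f[-1]) / dx**2
    return g

for _ in range(steps):
    act = v * (k0 + gam * u**2 / (K**2 + u**2))   # positive-feedback activation
    u += dt * (Da * lap(u) + act - delta * u)
    v += dt * (Di * lap(v) - act + delta * u)

print(f"mean active u: front {u[:N//2].mean():.2f} vs back {u[N//2:].mean():.2f}")
```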
"2339241","CAREER: Learning stochastic spatiotemporal dynamics in single-molecule genetics","DMS","Cellular Dynamics and Function, STATISTICS, MATHEMATICAL BIOLOGY","07/01/2024","01/29/2024","Christopher Miles","CA","University of California-Irvine","Continuing Grant","Amina Eladdadi","06/30/2029","$239,517.00","","cemiles@uci.edu","160 ALDRICH HALL","IRVINE","CA","926970001","9498247295","MPS","111400, 126900, 733400","068Z, 079Z, 1045, 7465, 8038","$0.00","The ability to measure which genes are expressed in cells has revolutionized our understanding of biological systems. Discoveries range from pinpointing what makes different cell types unique (e.g., a skin vs. brain cell) to how diseases emerge from genetic mutations. This gene expression data is now a ubiquitously used tool in every cell biologist?s toolbox. However, the mathematical theories for reliably extracting insight from this data have lagged behind the amazing progress of the techniques for harvesting it. This CAREER project will develop key theoretical foundations for analyzing imaging data of gene expression. The advances span theory to practice, including developing mathematical models and machine-learning approaches that will be used with data from experimental collaborators. Altogether, the project aims to create a new gold standard of techniques in studying spatial imaging data of gene expression and enable revelation of new biological and biomedical insights. In addition, this proposed research will incorporate interdisciplinary graduate students and local community college undergraduates to train the next generation of scientists in the ever-evolving intersection of data science, biology, and mathematics. Alongside research activities, the project will create mentorship networks for supporting first-generation student scientists in pursuit of diversifying the STEM workforce.
The supported research is a comprehensive program for studying single-molecule gene expression spatial patterns through the lens of stochastic reaction-diffusion models. The key aim is to generalize mathematical connections between these models and their observation as spatial point processes. The new theory will incorporate factors necessary to describe spatial gene expression at subcellular and multicellular scales including various reactions, spatial movements, and geometric effects. This project will also establish the statistical theory of inference on the resulting inverse problem of inferring stochastic rates from only snapshots of individual particle positions. Investigations into parameter identifiability, optimal experimental design, and model selection will ensure robust and reliable inference. In complement to the developed theory, this project will implement and benchmark cutting-edge approaches for efficiently performing large-scale statistical inference, including variational Bayesian Monte Carlo and physics-informed neural networks. The culmination of this work will be packaged into open-source software that infers interpretable biophysical parameters from multi-gene tissue-scale datasets.
This CAREER Award is co-funded by the Mathematical Biology and Statistics Programs at the Division of Mathematical Sciences and the Cellular Dynamics & Function Cluster in the Division of Molecular & Cellular Biosciences, BIO Directorate.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
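The point-process viewpoint in this record can be illustrated with a minimal identifiability example under an assumed model: transcripts born at a site at rate k, diffusing with coefficient D, and degraded at rate g. At stationarity the particle count is Poisson(k/g), and each particle's displacement has variance 2D/g (exponentially distributed age times 2D per unit time), so a single snapshot identifies g and then k when D is known. The sketch samples directly from this stationary law rather than simulating the dynamics.

```python
# One-snapshot inference for an assumed birth-diffusion-degradation model:
# count ~ Poisson(k/g); displacement | age t ~ Normal(0, 2*D*t); age ~ Exp(g),
# so the marginal displacement variance is 2*D/g. Rates are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
k, g, D = 50.0, 0.5, 0.2              # hypothetical birth, degradation, diffusion

n = rng.poisson(k / g)                # snapshot particle count
ages = rng.exponential(1 / g, size=n)
pos = rng.normal(0.0, np.sqrt(2 * D * ages))

g_hat = 2 * D / pos.var()             # method-of-moments from spatial spread
k_hat = n * g_hat                     # count times degradation rate
print(f"true (k, g) = ({k}, {g}); estimated ({k_hat:.1f}, {g_hat:.2f})")
```

The project's setting replaces this closed-form moment matching with likelihood-based inference over far richer reaction-diffusion models and geometries.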
"2339000","CAREER: Dynamics and harvesting of stochastic populations","DMS","MATHEMATICAL BIOLOGY","07/01/2024","01/10/2024","Alexandru Hening","TX","Texas A&M University","Continuing Grant","Zhilan Feng","06/30/2029","$45,704.00","","ahening@tamu.edu","400 HARVEY MITCHELL PKY S STE 30","COLLEGE STATION","TX","778454375","9798626777","MPS","733400","068Z, 1045","$0.00","Environmental fluctuations have been shown to drive populations extinct, facilitate persistence, reverse competitive exclusion, change genetic diversity, and modify the spread of infectious diseases. It is important to study the interplay between environmental fluctuations, both deterministic and random, and the persistence of interacting species. Developing a rigorous mathematical theory for coexistence, in conjunction with data-driven applications, will help theoretical ecologists pinpoint how harvesting and periodic or random environmental fluctuations affect the long term dynamics of ecological communities. Global climate change models predict increasing temporal variability in temperature, precipitation and storms in the next century. The research project will provide much-needed theoretical underpinning for this fast-moving area. The application related to the harvesting of marine animals will be key for conservation and management of vulnerable or endangered species. Questions around optimal control of stochastic models are vital in today's world where there are multiple global crises in a changing environment as well as species loss. Ecologists and evolutionary biologists invoke stochasticity as a key determinant of everything from population genetics to extinction risk. But the exposure that scientists from such disciplines actually get to the mathematical concepts underpinning stochastic processes is incomplete. An integral component of the educational objectives will be the organization of a summer school at the interface of biology and stochastics targeted to advanced undergraduate and graduate students from mathematics and biology.
In order to have realistic models for the coexistence of species, it is important to incorporate both periodic and random environmental fluctuations. Connecting ideas from dynamical systems and stochastic processes, it will be possible to show that the long-term dynamics is determined by the invasion rates (Lyapunov exponents) of the periodic measures living on the boundary of the state space. The developed ideas will then be used to look at non-stationary community theory, where the long-term behavior of the system cannot be described by an equilibrium, an attractor, or a stationary distribution. An important question from conservation biology is how to harvest a given population in order to maximize the yield while not driving the population extinct. While there are a few results for single-species systems, little is known in the significantly more realistic setting of interacting species. By using a combination of novel approaches from stochastic control and Markov chain approximation methods, one can analyze multi-species harvesting problems and then apply the results in order to gain insight into important real-life applications in fishery management.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
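The harvesting trade-off in this record has a familiar one-species caricature: a logistic population with multiplicative environmental noise harvested at constant effort h. Yield first rises with effort, but once roughly h > r - sigma^2/2 the population is driven extinct. The Euler-Maruyama sketch below uses assumed parameters and is far simpler than the multi-species control problems the project targets.

```python
# Stochastic logistic model dX = X(r(1 - X/K) - h) dt + sigma X dW harvested
# at effort h; trajectories are vectorized across replicates and absorbed at
# a small extinction threshold. All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(4)
r, K, sigma = 1.0, 1.0, 0.4
dt, T, reps = 0.01, 100.0, 200
for h in (0.1, 0.4, 0.7, 0.95):
    x = np.full(reps, 0.5)
    harvested = np.zeros(reps)
    for _ in range(int(T / dt)):
        drift = r * x * (1 - x / K) - h * x
        x = x + dt * drift + sigma * x * np.sqrt(dt) * rng.normal(size=reps)
        x = np.where(x < 1e-4, 0.0, x)        # absorb extinct trajectories
        harvested += dt * h * x
    extinct = int(np.sum(x == 0.0))
    print(f"h={h:.2f}: mean yield rate {harvested.mean()/T:.3f}, "
          f"extinct {extinct}/{reps}")
```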
"2344576","Risk Factor Analysis and Dynamic Response for Epidemics in Heterogeneous Populations","DMS","Human Networks & Data Sci Res, MATHEMATICAL BIOLOGY","09/01/2024","02/14/2024","Thomas Barthel","NC","Duke University","Continuing Grant","Zhilan Feng","08/31/2027","$183,131.00","James Moody, Charles Nunn","barthel@phy.duke.edu","2200 W MAIN ST","DURHAM","NC","277054640","9196843030","MPS","147Y00, 733400","068Z","$0.00","In today's highly connected world, the prevention, prediction, and control of epidemics is of paramount importance for global health, economic productivity, and geopolitical stability. Numerous infectious disease outbreaks over the past two decades have demonstrated the need for epidemiological modeling. They also revealed shortcomings of existing scientific techniques to accurately predict epidemic dynamics and to devise effective control strategies. This project will establish a new efficient simulation method that makes it possible to assess rare but highly consequential events. It will be used to identify decisive risk factors concerning the fabric of virus-spreading interactions that can facilitate large epidemic outbreaks. A well-documented example are superspreading events that played an important role in the COVID-19 pandemic. The investigations will be focused on models for diseases similar to COVID-19 and HIV as archetypal cases. The improved understanding and models of epidemiological processes will be used to devise and analyze efficient preventive strategies with the goal of providing more reliable guidance for the general public and health-policy decision makers, saving lives and resources.
Traditionally, the dynamics of infectious diseases are studied on the basis of deterministic compartmental models, where the population is divided into large groups, and deterministic differential equations for the group sizes are employed to investigate disease dynamics. Classical examples are the deterministic SIR and SIS models. This is a strong simplification of reality that ignores to a large extent the heterogeneity in contact patterns and biomedically relevant attributes across the population as well as the stochastic nature of infection processes. Both have a decisive impact on the dynamics at the early stages of epidemic outbreaks and need to be incorporated to enable reliable predictions. Markov-chain Monte Carlo methods can sample more realistic stochastic agent-based dynamics, but cannot efficiently assess the preconditions leading to rare consequential events. The project will address this challenge with a new numerical technique that allows one to efficiently sample important but rare epidemic trajectories of realistic models under suitable constraints. The research will renew attention on the crucial role of rare events in the genesis of large outbreaks, including combinations of bottlenecks in contact networks and the stochastic nature of the disease dynamics. Risk-factor analysis based on the new method will provide answers to cutting-edge questions in disease diffusion concerning outbreak preconditions, information flow, and control strategies. This approach will open new avenues for research on the prevention and control of epidemics.
This project is jointly funded by the Mathematical Biology program of the Division of Mathematical Sciences (DMS) in the Directorate for Mathematical and Physical Sciences (MPS) and the Human Networks and Data Science program (HNDS) of the Division of Behavioral and Cognitive Sciences (BCS) in the Directorate for Social, Behavioral and Economic Sciences (SBE).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
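The rare-event difficulty this record addresses is easy to see in the simplest stochastic SIR model: starting from one infection, the embedded jump chain either fizzles or produces a major outbreak with probability roughly 1 - 1/R0. Naive Monte Carlo, sketched below with assumed parameters, needs enormous sample sizes once the event of interest becomes rare, which is the motivation for constrained trajectory sampling.

```python
# Stochastic SIR final-size experiment via the embedded jump chain (event
# times are not needed for final sizes). A "major outbreak" is declared when
# more than 10% of the population is ever infected. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(5)
N, beta, gamma, runs = 1000, 1.5, 1.0, 2000   # R0 = beta/gamma = 1.5
major = 0
for _ in range(runs):
    S, I, recovered = N - 1, 1, 0
    while I > 0:
        rate_inf = beta * S * I / N
        rate_rec = gamma * I
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            S -= 1; I += 1                    # infection event
        else:
            I -= 1; recovered += 1            # recovery event
    if recovered > 0.1 * N:
        major += 1
print(f"P(major outbreak) ~ {major/runs:.3f} "
      f"(branching approximation: {1 - gamma/beta:.3f})")
```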
-"2327844","IHBEM: Using socioeconomic, behavioral and environmental data to understand disease dynamics: exploring COVID-19 outcomes in Oklahoma","DMS","MATHEMATICAL BIOLOGY, MSPA-INTERDISCIPLINARY","01/01/2024","08/23/2023","Patrick Stephens","OK","Oklahoma State University","Continuing Grant","Zhilan Feng","12/31/2026","$425,713.00","Lucas Stolerman, Juwon Hwang, Tao Hu, Rebecca Kaplan","patrick.stephens@okstate.edu","401 WHITEHURST HALL","STILLWATER","OK","740781031","4057449995","MPS","733400, 745400","068Z, 079Z, 9102, 9150, 9179","$0.00","One of the most critical modern challenges is to better understand the where, why and how oflarge disease outbreak occurrence. Research shows that the frequency of large disease outbreaks is increasing over time globally, and yet differences in outcomes remain poorly understood. This research will explore the factors that drove variation in COVID-19 outcomes across the counties and metropolitan areas of Oklahoma, particularly which areas had more or fewer cases than would be expected based on their overall population size. The investigators will look at both environmental factors, such as weather patterns and air quality, and socioeconomic factors such as numbers of doctors and differences in the proportion of individuals that were willing to be vaccinated. The investigators will also conduct surveys of individual across the state to try and better understand why people made the healthcare choices that they did and how behavior drove differences in outcomes. Understanding all of these factors requires a team with diverse expertise. Traditionally, most mathematical and quantitative models for disease dynamics have been developed and studied by mathematicians, ecologists, and computer scientists. However, understanding differences in attitudes towards health care measures and how they originate is more the purview of social scientists and historians. By building a team of collaborators spanning all of these disciplines, the research team will be able to build a more complete picture of COVID-19 outcomes in Oklahoma. This will in turn suggest what actions may be most effective to try and best mitigate the effects of both COVID and other large-scale disease events in the future. The final product of this work will include a new data repository and a public-facing intelligent epidemiological modeling platform powered by Jupyter Notebooks. The project will also provide outreach and training, including to students from underrepresented groups.
Increases in outbreak frequency seem to be related to globalization and other human activities. Yet the effects of most human behavioral, social and economic factors on outbreak risk are rarely quantified. Relevant social factors can be hard to measure, often requiring specialists to generate and interpret data. However, social scientists with the expertise to do so are rarely trained in mathematical modelling of disease dynamics. To address these challenges, the investigators will focus on developing data sources and mathematical models that can be used to explore COVID-19 outcomes in Oklahoma. The project will be a true collaboration between social scientists and experts in modelling infectious diseases. Oklahoma is understudied and spatially heterogeneous, such that models of disease dynamics in Oklahoma are likely to be generalizable to many other regions of the US. The Investigators will generate protocols for standardizing existing data on behavioral and socioeconomic factors as well as develop new data sources. The team will develop statistical models of past outbreaks, and mathematical models reflecting factors shown to have driven COVID-19 dynamics empirically. The latter work will demonstrate how baseline SIR-like models can be modified to reflect human behavioral factors. The Investigators will also contrast the performance of models based on existing data on socioeconomic factors with models incorporating new survey data on variation in behaviors and attitudes related to primary and secondary prevention. The code and datasets to be generated will be made freely available and searchable in an intelligent epidemiological modeling framework, which will enable other researchers to easily iterate on them.
This project is jointly funded by the Division of Mathematical Sciences (DMS) in the Directorate of Mathematical and Physical Sciences (MPS) and the Division of Social and Economic Sciences (SES) in the Directorate of Social, Behavioral and Economic Sciences (SBE).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2428961","Characterization and Prediction of Viral Capsid Geometries","DMS","MATHEMATICAL BIOLOGY","02/01/2024","05/30/2024","Antoni Luque","FL","University of Miami","Continuing Grant","Zhilan Feng","08/31/2024","$22,114.00","","axl4306@miami.edu","1320 SOUTH DIXIE HIGHWAY STE 650","CORAL GABLES","FL","331462919","3052843924","MPS","733400","","$0.00","Viruses are the most abundant biological entity on the planet and play a crucial role in the evolution of organisms and the biogeochemistry of Earth. Closely related viruses, however, can have very dissimilar genomes, complicating integration of knowledge acquired from the study of independent viruses, and limiting prediction of the characteristics and potential threats of emerging viruses. Viruses, however, conserve a few structural properties that could help circumvent this problem. Most viruses store their infective genetic material in a protein shell called a capsid. The capsid self-assembles from multiple copies of the same (or similar) proteins, and most capsids display icosahedral symmetry. This architecture optimizes the interaction of proteins and the volume available to store the viral genetic information. This research project hypothesizes that viruses have evolved a limited set of replication strategies to specialize and exploit the reduced number of geometrical templates capable of forming icosahedral capsids. This, in turn, may have constrained the number of three-dimensional configurations adopted by capsid proteins, providing a mechanistic rationale for the existence of viral structural lineages. This hypothesis will be tested by analyzing and comparing hundreds of viruses from multiple different viral families using novel mathematical methods. Confirming this hypothesis will offer a quantitative framework to study viral evolution and open the door to design of generic antiviral strategies targeting viruses in the same structural lineage.
Only ten protein folds have been identified among major capsid proteins of viruses that form icosahedral capsids. These folds define viral lineages that group viruses that can be genetically unrelated and infect hosts from different domains of life. This limited number of folds contrasts with the vast genetic diversity of viruses. The prevalence of these folds across the virosphere, however, remains unknown. Here, it is hypothesized that there is a direct relationship between the viral replication strategy of each viral lineage, the icosahedral lattice of the capsid, and the fold of capsid proteins. The hypothesis will be tested by developing a database that will include the viral replication, protein fold, and capsid lattice of five hundred viruses that have been reconstructed at high or medium molecular resolution. Voronoi tessellations and protein-protein interaction lattices will be obtained to computationally identify the icosahedral lattice associated with each virus. Additionally, molecular measurements of the reconstructed capsids will be obtained to establish allometric relationships for at least one viral lineage, facilitating the prediction of icosahedral capsid properties from genomic information. The new icosahedral framework will also be extended to obtain new sets of elongated capsids, which represent the second most abundant type of capsid. The methods will be disseminated online for use by viral structure researchers.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
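The icosahedral lattices in question are classically indexed by the Caspar-Klug triangulation number T = h^2 + hk + k^2, with 60T protein subunits per capsid. The short enumeration below lists the first few admissible geometries; this is textbook material, not the project's Voronoi-based classification.

```python
# Enumerate Caspar-Klug triangulation numbers T = h^2 + h*k + k^2 and the
# corresponding subunit counts 60*T for small lattice indices (h, k).
def t_number(h, k):
    return h * h + h * k + k * k

seen = {}
for h in range(6):
    for k in range(6):
        if (h, k) == (0, 0):
            continue
        seen.setdefault(t_number(h, k), (h, k))

for T in sorted(seen)[:8]:
    h, k = seen[T]
    print(f"T={T:3d}  (h,k)=({h},{k})  subunits={60 * T}")
```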
+"2327844","IHBEM: Using socioeconomic, behavioral and environmental data to understand disease dynamics: exploring COVID-19 outcomes in Oklahoma","DMS","MATHEMATICAL BIOLOGY, MSPA-INTERDISCIPLINARY","01/01/2024","08/23/2023","Patrick Stephens","OK","Oklahoma State University","Continuing Grant","Zhilan Feng","12/31/2026","$425,713.00","Lucas Stolerman, Juwon Hwang, Tao Hu, Rebecca Kaplan","patrick.stephens@okstate.edu","401 WHITEHURST HALL","STILLWATER","OK","740781031","4057449995","MPS","733400, 745400","068Z, 079Z, 9102, 9150, 9179","$0.00","One of the most critical modern challenges is to better understand the where, why and how oflarge disease outbreak occurrence. Research shows that the frequency of large disease outbreaks is increasing over time globally, and yet differences in outcomes remain poorly understood. This research will explore the factors that drove variation in COVID-19 outcomes across the counties and metropolitan areas of Oklahoma, particularly which areas had more or fewer cases than would be expected based on their overall population size. The investigators will look at both environmental factors, such as weather patterns and air quality, and socioeconomic factors such as numbers of doctors and differences in the proportion of individuals that were willing to be vaccinated. The investigators will also conduct surveys of individual across the state to try and better understand why people made the healthcare choices that they did and how behavior drove differences in outcomes. Understanding all of these factors requires a team with diverse expertise. Traditionally, most mathematical and quantitative models for disease dynamics have been developed and studied by mathematicians, ecologists, and computer scientists. However, understanding differences in attitudes towards health care measures and how they originate is more the purview of social scientists and historians. By building a team of collaborators spanning all of these disciplines, the research team will be able to build a more complete picture of COVID-19 outcomes in Oklahoma. This will in turn suggest what actions may be most effective to try and best mitigate the effects of both COVID and other large-scale disease events in the future. The final product of this work will include a new data repository and a public-facing intelligent epidemiological modeling platform powered by Jupyter Notebooks. The project will also provide outreach and training, including to students from underrepresented groups.
Increases in outbreak frequency seem to be related to globalization and other human activities. Yet the effects of most human behavioral, social and economic factors on outbreak risk are rarely quantified. Relevant social factors can be hard to measure, often requiring specialists to generate and interpret data. However, social scientists with the expertise to do so are rarely trained in mathematical modelling of disease dynamics. To address these challenges, the investigators will focus on developing data sources and mathematical models that can be used to explore COVID-19 outcomes in Oklahoma. The project will be a true collaboration between social scientists and experts in modelling infectious diseases. Oklahoma is understudied and spatially heterogeneous, such that models of disease dynamics in Oklahoma are likely to be generalizable to many other regions of the US. The Investigators will generate protocols for standardizing existing data on behavioral and socioeconomic factors as well as develop new data sources. The team will develop statistical models of past outbreaks, and mathematical models reflecting factors shown to have driven COVID-19 dynamics empirically. The latter work will demonstrate how baseline SIR-like models can be modified to reflect human behavioral factors. The Investigators will also contrast the performance of models based on existing data on socioeconomic factors with models incorporating new survey data on variation in behaviors and attitudes related to primary and secondary prevention. The code and datasets to be generated will be made freely available and searchable in an intelligent epidemiological modeling framework, which will enable other researchers to easily iterate on them.
This project is jointly funded by the Division of Mathematical Sciences (DMS) in the Directorate of Mathematical and Physical Sciences (MPS) and the Division of Social and Economic Sciences (SES) in the Directorate of Social, Behavioral and Economic Sciences (SBE).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
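One concrete instance of "modifying a baseline SIR model to reflect human behavioral factors," as this abstract puts it, is to split the population into vaccination-willing and vaccination-hesitant groups that mix freely. The group sizes and all rates below are invented for illustration.

```python
# Two-group SIR sketch: identical mixing, different vaccination uptake.
# This only illustrates the kind of behavioral structure the abstract
# describes adding to baseline SIR models; all numbers are assumed.
size = {"willing": 0.6, "hesitant": 0.4}
vax = {"willing": 0.01, "hesitant": 0.0005}    # per-day vaccination rates
beta, gamma, dt = 0.35, 0.15, 0.1

S = {g: size[g] - 0.001 for g in size}
I = {g: 0.001 for g in size}
R = {g: 0.0 for g in size}
cum = {g: 0.0 for g in size}                   # cumulative infections

for _ in range(int(365 / dt)):
    I_tot = sum(I.values())                    # well-mixed contacts (assumed)
    for g in size:
        new_inf = beta * S[g] * I_tot
        new_vax = vax[g] * S[g]
        rec = gamma * I[g]
        S[g] += dt * (-new_inf - new_vax)
        I[g] += dt * (new_inf - rec)
        R[g] += dt * (rec + new_vax)
        cum[g] += dt * new_inf

for g in size:
    print(f"{g}: attack rate {cum[g] / size[g]:.2%}")
```

Even with identical exposure, the hesitant group accumulates a larger attack rate, which is the kind of contrast that survey data on attitudes would let such models calibrate.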
"2330970","Trait-shift induced interaction modification: How individual variation affects ecosystem stability","DMS","Population & Community Ecology, MATHEMATICAL BIOLOGY","05/01/2024","02/01/2024","BingKan Xue","FL","University of Florida","Standard Grant","Amina Eladdadi","04/30/2027","$561,268.00","Robert Holt, Mathew Leibold","b.xue@ufl.edu","1523 UNION RD RM 207","GAINESVILLE","FL","326111941","3523923516","MPS","112800, 733400","068Z, 124Z","$0.00","Individual organisms of the same species can exhibit substantial variation in traits that are important for their interaction with the environment and other species. The distribution of such variable traits within a species can shift over time. For example, behavioral traits can change quickly in response to environmental change or species interactions. The dynamic shift of trait distribution within a species can modify the interaction patterns among species and ultimately affect the stability of ecosystems. However, the prevalence and significance of such effects have not been evaluated systematically. This research project aims to fill in the gap by integrating theoretical and empirical approaches. A mathematical framework will be developed to classify different types of interaction modification and identify conditions that would destabilize ecosystems. Additionally, a comprehensive survey of empirical studies will be conducted to assess the level of evidence for the various types of interaction modification identified above and provide criteria for evaluating past and future case studies. This project will train young researchers in the interdisciplinary area between mathematics and ecology to address important environmental and ecological problems facing the next generation. In addition, educational kits and programs will be designed to promote teaching of important ecological concepts and engage underrepresented groups in K-12 schools.
Intraspecific trait variation has been increasingly recognized as an important factor in determining ecological and evolutionary dynamics. Although some work has examined how the variation of heritable traits affects eco-evolutionary dynamics, non-heritable variation caused by phenotypic plasticity, developmental differences, or species interactions has received less consideration. Ecosystems have traditionally been studied as dynamical systems with fixed interaction strengths, but these interaction strengths will no longer be constant if the trait distributions can shift within short timescales. Such ""trait-shift induced interaction modification"" (TSIIM) can lead to higher-order interactions among species, causing species extinction or coexistence that would otherwise be unexpected. This project will develop a theoretical framework for studying TSIIM by generalizing traditional dynamical systems to incorporate intraspecific trait variation and categorizing these systems into different network motifs. The effect on large ecosystems will be studied using a disordered systems approach augmented by the new motifs. Meta-analysis of these motifs using empirical studies will provide insight into the mechanisms by which higher-order interactions arise from TSIIM, prompting experimental searches for such phenomena and their consequences.
This project is jointly funded by the Mathematical Biology Program at the Division of Mathematical Sciences and the Population and Community Ecology Cluster in the Division of Environmental Biology at the Directorate for Biological Sciences.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2327892","Collaborative Research: RUI: Topological methods for analyzing shifting patterns and population collapse","DMS","Population & Community Ecology, MATHEMATICAL BIOLOGY, EPSCoR Co-Funding","02/01/2024","01/25/2024","Laura Storch","ME","Bates College","Standard Grant","Zhilan Feng","01/31/2027","$169,439.00","","lstorch@bates.edu","2 ANDREWS ROAD","LEWISTON","ME","042406030","2077868375","MPS","112800, 733400, 915000","9150, 9229","$0.00","Profound and irreversible changes in ecosystems, such as population collapse, are occurring globally due to climate change, habitat destruction, and overuse of natural resources, and are only expected to become more frequent in the future. To prevent an impending collapse, we must recognize the early warning signs. This is particularly challenging in ecological systems due to their naturally complex behavior in both space and time, as well as noisy and/or poorly resolved data. In this project, the investigators will use a novel approach for early detection of impending population collapse, and apply the methodology to spatially distributed populations, for example, a grassland. They utilize a method called computational topology, which can quantify features of a population distribution pattern, such as the level of patchiness in the pattern. In previous work, the investigators used a spatial population model to quantify the changes in a population distribution pattern that occurred as the population went extinct and observed a ""topological route to extinction"". In this project, the investigators will develop and extend the methodology for use in stochastic population models and real-world data sets, which are expected to contain high levels of noise and/or missing/corrupted data. The developed methodology will serve as an additional tool for the prediction of impending population collapse. This tool can then be used by conservation biologists and natural resource managers in order to assist in preserving vulnerable species and ecosystems. The project also supports undergraduate research, and includes recruitment efforts directed at students from underrepresented groups.
In previous work on data generated by a deterministic population model, the investigators measured changes in topological features (via cubical homology) of population distribution patterns en route to extinction, and observed clear topological signatures of impending collapse. Results with the deterministic model serve as a proof of concept, but in this project, the investigators will study dynamical changes in stochastic population models and real ecological data sets. Transitioning from deterministic to stochastic systems will require substantial development of the methodology, and will require the use of more sophisticated tools, e.g., multiparameter persistent homology. The developed methodology must be able to detect signal in noisy data, corrupted data, missing data, and data that is sparse in space and/or time. Because the topological approach can distinguish fine-scale stochastic noise from large-scale deterministic spatial patterns, it is a promising tool for the analysis of noisy ecological data, and preliminary work using multiparameter persistence shows that it is capable of recovering ""true"" dynamical signal (a population distribution pattern) from noise.
This project is jointly funded by the Mathematical Biology program of the Division of Mathematical Sciences (DMS) in the Directorate for Mathematical and Physical Sciences (MPS), the Established Program to Stimulate Competitive Research (EPSCoR), and the Population and Community Ecology Cluster (PEC) of the Division of Environmental Biology (DEB) in the Directorate for Biological Sciences (BIO).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2327893","Collaborative Research: RUI: Topological methods for analyzing shifting patterns and population collapse","DMS","Population & Community Ecology, MATHEMATICAL BIOLOGY","02/01/2024","01/25/2024","Sarah Day","VA","College of William and Mary","Standard Grant","Zhilan Feng","01/31/2027","$192,542.00","","sday@math.wm.edu","1314 S MOUNT VERNON AVE","WILLIAMSBURG","VA","231852817","7572213965","MPS","112800, 733400","068Z, 9150, 9229","$0.00","Profound and irreversible changes in ecosystems, such as population collapse, are occurring globally due to climate change, habitat destruction, and overuse of natural resources, and are only expected to become more frequent in the future. To prevent an impending collapse, we must recognize the early warning signs. This is particularly challenging in ecological systems due to their naturally complex behavior in both space and time, as well as noisy and/or poorly resolved data. In this project, the investigators will use a novel approach for early detection of impending population collapse, and apply the methodology to spatially distributed populations, for example, a grassland. They utilize a method called computational topology, which can quantify features of a population distribution pattern, such as the level of patchiness in the pattern. In previous work, the investigators used a spatial population model to quantify the changes in a population distribution pattern that occurred as the population went extinct and observed a ""topological route to extinction"". In this project, the investigators will develop and extend the methodology for use in stochastic population models and real-world data sets, which are expected to contain high levels of noise and/or missing/corrupted data. The developed methodology will serve as an additional tool for the prediction of impending population collapse. This tool can then be used by conservation biologists and natural resource managers in order to assist in preserving vulnerable species and ecosystems. The project also supports undergraduate research, and includes recruitment efforts directed at students from underrepresented groups.
In previous work on data generated by a deterministic population model, the investigators measured changes in topological features (via cubical homology) of population distribution patterns en route to extinction, and observed clear topological signatures of impending collapse. Results with the deterministic model serve as a proof of concept, but in this project, the investigators will study dynamical changes in stochastic population models and real ecological data sets. Transitioning from deterministic to stochastic systems will require substantial development of the methodology, and will require the use of more sophisticated tools, e.g., multiparameter persistent homology. The developed methodology must be able to detect signal in noisy data, corrupted data, missing data, and data that is sparse in space and/or time. Because the topological approach can distinguish fine-scale stochastic noise from large-scale deterministic spatial patterns, it is a promising tool for the analysis of noisy ecological data, and preliminary work using multiparameter persistence shows that it is capable of recovering ""true"" dynamical signal (a population distribution pattern) from noise.
This project is jointly funded by the Mathematical Biology program of the Division of Mathematical Sciences (DMS) in the Directorate for Mathematical and Physical Sciences (MPS), the Established Program to Stimulate Competitive Research (EPSCoR), and the Population and Community Ecology Cluster (PEC) of the Division of Environmental Biology (DEB) in the Directorate for Biological Sciences (BIO).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
diff --git a/Statistics/Awards-Statistics-2024.csv b/Statistics/Awards-Statistics-2024.csv
index 16f8669..c0b5706 100644
--- a/Statistics/Awards-Statistics-2024.csv
+++ b/Statistics/Awards-Statistics-2024.csv
@@ -1,10 +1,11 @@
"AwardNumber","Title","NSFOrganization","Program(s)","StartDate","LastAmendmentDate","PrincipalInvestigator","State","Organization","AwardInstrument","ProgramManager","EndDate","AwardedAmountToDate","Co-PIName(s)","PIEmailAddress","OrganizationStreet","OrganizationCity","OrganizationState","OrganizationZip","OrganizationPhone","NSFDirectorate","ProgramElementCode(s)","ProgramReferenceCode(s)","ARRAAmount","Abstract"
+"2330089","Scalable Entity Resolution for Massive and Streaming Data Contexts","DMS","STATISTICS","09/01/2024","08/08/2024","Andrea Kaplan","CO","Colorado State University","Continuing Grant","Tapabrata Maiti","08/31/2027","$64,072.00","","akaplan@colostate.edu","601 S HOWES ST","FORT COLLINS","CO","805212807","9704916355","MPS","126900","9263","$0.00","This research project will develop the first model-based statistical approach to perform entity
resolution for streaming data contexts and will address the issue of scalability in both online and
offline scenarios. With the ubiquity of data, linking multiple data sets is a crucial first step in
many types of inference for myriad applications including healthcare, official statistics, ecology,
and fraud detection and national security. Entity resolution is the task of resolving duplicates
in two or more partially overlapping sets of records, or files, from noisy data sources without a
unique identifier. Statistical approaches to entity resolution are advantageous because they provide
interpretable parameters and quantify uncertainty in the linked records. Linking becomes more
challenging when the data update over time, termed streaming or online data, or when the data
scale is massive. Currently no statistically model-based approaches exist to resolve entities in an
online way. Relatedly, the nature of streaming data exacerbates the challenge of scalability in
that the number of records to be linked accumulates. The methods will be made accessible to
practitioners and other researchers through open-source software and the project will additionally
provide educational and professional training and mentoring to graduate students.
This project will expand model-based entity resolution into the streaming data space through
the formulation of new models and novel computational algorithms. Specifically, this project aims to
improve scalability of Bayesian entity resolution models through approximate sampling techniques,
such as variational inference, and develop fast updating of a Bayesian entity resolution model in a
streaming data context, resulting in the first Bayesian entity resolution model updating strategy
that can handle streaming data contexts for massive data sets. A limitation of existing methods for
streaming inference with Bayesian models is that the pool of samples to be updated will converge to
a degenerate distribution as the process is repeated many times, which guarantees poor quality of
model fits in a streaming setting. This issue will be addressed with introduction of a novel Markov
chain Monte Carlo sampler for streaming Bayesian inference, which will improve existing methods
by combining filtering ideas with a highly parallelizable transition kernel.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2413812","Modeling and Inference for Long-Term Effects","DMS","STATISTICS","08/15/2024","08/06/2024","Yaqi Duan","NY","New York University","Continuing Grant","Yong Zeng","07/31/2027","$39,839.00","","yd2878@stern.nyu.edu","70 WASHINGTON SQ S","NEW YORK","NY","100121019","2129982121","MPS","126900","075Z, 079Z","$0.00","Recent advancements in data-driven decision-making have transformed fields such as healthcare and digital marketing. These areas now prioritize understanding the long-term consequences of actions and policies. For example, drug design aims to mitigate long-term side effects without compromising immediate efficacy; online recommender systems like Netflix aim to improve long-term customer benefits while balancing them with short-term engagement metrics. Evaluating long-term effects is challenging due to the dynamic nature of environments and the cumulative uncertainty of future predictions, especially in real-life scenarios that require prompt decisions. This grant supports research to develop innovative statistical inference methods and models for estimating long-term effects efficiently and effectively. The PI will integrate research and education by involving graduate students in the research and incorporating findings into mini-courses at workshops. The project will also provide mentoring and support for URM graduate students and postdocs, fostering a diverse and inclusive research community.
In more detail, this project proposes several research thrusts that provide various models to capture long-term effects in real-life scenarios. The first thrust focuses on environments with time-homogeneous transitions, assuming a Markovian framework. The main goal is to use system observations to understand dynamics, establish robust estimators, and quantify uncertainty. The methods are expected to handle distributional shifts in data, misspecification in function approximation, and the potential high-dimensionality in models. The second thrust concerns non-stationary dynamic systems. Challenges include determining the change points as the system evolves, selecting the most useful and related data, and using an appropriate surrogate index approach to form a valid estimate. On top of the first two thrusts, the third one involves integrating multiple datasets to facilitate estimation. The goal is to develop methods that combine relevant but non-identical data sources effectively to mitigate the issue of data scarcity.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413875","Statistical Inference with Strategic Agents: Accounting for Incentives and Information Asymmetry","DMS","STATISTICS","08/15/2024","08/02/2024","Martin Wainwright","MA","Massachusetts Institute of Technology","Standard Grant","Yong Zeng","07/31/2027","$375,000.00","Stephen Bates","wainwrigwork@gmail.com","77 MASSACHUSETTS AVE","CAMBRIDGE","MA","021394301","6172531000","MPS","126900","075Z, 079Z","$0.00","This project aims to advance the understanding of statistical analysis within a broader environment with multiple stakeholders. In particular, the results of statistical hypothesis tests are often used to make yes/no decisions (e.g., whether to release a drug candidate to the public) that affect multiple parties. The decision materially impacts the party who developed the drug and public health generally, but in different ways, and this may impact how the stakeholders interact (e.g. when they choose to release data). We seek to develop statistical protocols that are robust to the behavior of different stakeholders who may have different aims. Progress on this front will lead to more reliable conclusions from data that are collected by multiple parties, contributing to the trustworthiness of the scientific enterprise. Moreover, the project is interdisciplinary in nature, bringing together ideas from statistics and also economics and social science. Fostering such connectivity is helpful for both fields, and such links are also useful educationally. The project also provides research training opportunities for students and postdocs.
The technical goal of this proposal is to study statistical inference in settings that involve both incentive structures, since actions and statistical decisions affect the behavior of other parties, and information asymmetry, since one party may have private information not available to others. In order to bring sharp focus to the core issues, this project tackles this challenge within a particular model of interaction known as the principal-agent model. As opposed to standard game-theoretic analyses of such interactions, the PIs study a statistical version of the principal-agent model, wherein the statistician plays the role of the principal. The principal has the goal of carrying out some form of statistical inference. In order to do so, the principal can interact with one or more agents that can provide statistically relevant information (e.g., a drug to test in a clinical trial, a feature set, datasets of varying and uncertain quality). Viewed as a two-person game, the principal moves first by specifying a statistical protocol, along with some kind of payment structure associated with it. The agent then makes its decision, which the PIs model in terms of expected utility maximization. The primary goal is to develop methodology and theory for hypothesis testing in this interactive setting.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413741","Design-Based Subsampling for Labeling Large and High-Dimensional Datasets","DMS","STATISTICS","09/01/2024","08/02/2024","Lin Wang","IN","Purdue University","Continuing Grant","Yong Zeng","08/31/2027","$47,255.00","","linwang@purdue.edu","2550 NORTHWESTERN AVE # 1100","WEST LAFAYETTE","IN","479061332","7654941055","MPS","126900","","$0.00","Data labeling, the process of assigning labels or annotations to data points, is crucial in supervised machine learning for training models to make accurate predictions in various applications. Labels refer to the output variables that the machine learning model aims to predict or classify. For instance, in genetic and genomic studies, labels may refer to traits or the presence of diseases, and accurate data labeling is essential for training models to understand the relationships between genetic information and various traits or diseases. The labeling process is resource-intensive, requiring domain expertise, advanced experimentation techniques, and rigorous quality control to ensure accuracy. Consequently, labeling all data points in a large dataset is often impractical due to resource limitations. Therefore, selecting an informative subsample from the large pool of data points to label becomes a critical and challenging problem. This project aims to develop subsampling methods for labeling large and high-dimensional datasets. The anticipated results will be applicable to genetics, biology, and medicine. Graduate and undergraduate students will be involved in the project and exposed to these results, which will also be incorporated into university courses.
The project aims to develop advanced subsampling techniques for data labeling, particularly for large and high-dimensional datasets. Optimal subsampling approaches for both continuous and binary labels will be developed to enhance the predictive performance of models trained on the labeled subsample. Additionally, sequential sampling plans will be investigated. The project also emphasizes fairness in the trained models, striving to develop subsampling methods that ensure a balanced representation of multiple demographic groups in the labeled data, achieving consistent accuracy across all demographic groups. This project will contribute substantially to advancing subsampling methodologies in statistics and data science.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2413701","Advancing Nonparametric Online Inference: Optimal Uncertainty Quantification and Decision-Making for Streaming Data","DMS","STATISTICS","08/15/2024","07/31/2024","Meimei Liu","VA","Virginia Polytechnic Institute and State University","Continuing Grant","Yong Zeng","07/31/2027","$50,000.00","","meimeiliu@vt.edu","300 TURNER ST NW","BLACKSBURG","VA","240603359","5402315281","MPS","126900","075Z, 079Z","$0.00","This research project will enhance data science by developing new methodologies for uncertainty quantification and decision-making in large-scale, streaming data with complex structures. These novel statistical tools will improve real-time analysis in fields such as mobile health, infectious disease surveillance, and neuroscience. In healthcare, the developed methods will advance diagnostic accuracy and treatment precision across various medical conditions like diabetes and Alzheimer's disease, enabling timely interventions and personalized care strategies based on real-time data analysis. Additionally, the project will integrate its research into K-12 educational programs and offer training opportunities for graduate and undergraduate students, focusing on engaging underrepresented groups in STEM. This initiative aims to cultivate the next generation of data scientists and statisticians, equipping them with the vital skills needed to address future challenges in data-driven fields.
The research will establish a unified framework for nonparametric online statistical inference using an online multiplier bootstrap approach combined with functional stochastic gradient descent (SGD) algorithms. This framework will include local and global confidence intervals and bands, pattern and signal detection via hypothesis testing, and real-time decision-making strategies for nonparametric regressions. These methods will be applicable to various data scenarios, from independent to dependent data. The project will characterize the non-asymptotic behavior of the functional SGD estimator, validate the consistency of the multiplier bootstrap method, establish honest confidence bands, and demonstrate minimax optimal testing consistency of the proposed inference tools. By developing a solid foundation with accompanying software for nonparametric online inference, this research will advance methodologies in online data-driven decision-making, with broad applications ranging from mobile health to financial markets.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2413812","Modeling and Inference for Long-Term Effects","DMS","STATISTICS","08/15/2024","08/06/2024","Yaqi Duan","NY","New York University","Continuing Grant","Yong Zeng","07/31/2027","$39,839.00","","yd2878@stern.nyu.edu","70 WASHINGTON SQ S","NEW YORK","NY","100121019","2129982121","MPS","126900","075Z, 079Z","$0.00","Recent advancements in data-driven decision-making have transformed fields such as healthcare and digital marketing. These areas now prioritize understanding the long-term consequences of actions and policies. For example, drug design aims to mitigate long-term side effects without compromising immediate efficacy; online recommender systems like Netflix aim to improve long-term customer benefits while balancing them with short-term engagement metrics. Evaluating long-term effects is challenging due to the dynamic nature of environments and the cumulative uncertainty of future predictions, especially in real-life scenarios that require prompt decisions. This grant supports research to develop innovative statistical inference methods and models for estimating long-term effects efficiently and effectively. The PI will integrate research and education by involving graduate students in the research and incorporating findings into mini-courses at workshops. The project will also provide mentoring and support for URM graduate students and postdocs, fostering a diverse and inclusive research community.
In more detail, this project proposes several research thrusts that provide various models to capture long-term effects in real-life scenarios. The first thrust focuses on environments with time-homogeneous transitions, assuming a Markovian framework. The main goal is to use system observations to understand dynamics, establish robust estimators, and quantify uncertainty. The methods are expected to handle distributional shifts in data, misspecification in function approximation, and the potential high-dimensionality in models. The second thrust concerns non-stationary dynamic systems. Challenges include determining the change points as the system evolves, selecting the most useful and related data, and using an appropriate surrogate index approach to form a valid estimate. On top of the first two thrusts, the third one involves integrating multiple datasets to facilitate estimation. The goal is to develop methods that combine relevant but non-identical data sources effectively to mitigate the issue of data scarcity.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2432168","Conference: Interagency Analysis and Modeling Group/Multiscale Modeling Consortium (IMAG/MSM) Meeting on Operationalizing the NASEM Report on Digital Twins","DMS","STATISTICS, COMPUTATIONAL MATHEMATICS, MATHEMATICAL BIOLOGY","08/15/2024","08/02/2024","Gary An","VT","University of Vermont & State Agricultural College","Standard Grant","Troy D. Butler","07/31/2025","$30,000.00","","gan@med.med.edu","85 S PROSPECT STREET","BURLINGTON","VT","054051704","8026563660","MPS","126900, 127100, 733400","075Z, 079Z, 7556, 9150","$0.00","On December 15, 2023, The National Academies of Sciences, Engineering and Medicine (NASEM) released a report entitled: ?Foundational Research Gaps and Future Directions for Digital Twins? (?NASEM DT REPORT?). The purpose of this report was to bring structure to the burgeoning field of digital twins by providing a working definition and a series of research challenges that need to be addressed to allow this technology to fulfill its full potential. The concept of digital twins is compelling and has the potential to impact a broad range of domains. For instance, digital twins have either been proposed or are currently being developed for manufactured devices, buildings, cities, ecologies and the Earth as a whole. It is natural that the concept be applied to biology and medicine, as the most recognizable concept of a ?twin? is that of identical human twins. The application of digital twins to biomedicine also follows existing trends of Personalized and Precision medicine, in short: ?the right treatment for the right person at the right time.? Fulfilling the promise of biomedical digital twins will require multidisciplinary Team Science that brings together various experts from fields as diverse as medicine, computer science, engineering, biological research, advanced mathematics and ethics. The purpose of this conference, the ?2024 Interagency Modeling and Analysis Group (IMAG)/Multiscale Modeling (MSM) Consortium Meeting: Setting up Teams for Biomedical Digital Twins,? is to do exactly this: bringing together such needed expertise in a series of teaming exercises to operationalize the findings of the NASEM DT REPORT in the context of biomedical digital twins. As part of outreach and training efforts to broaden the participation within this growing field, this workshop will provide support for both traditionally under-represented categories of senior researchers as well as junior researchers such as graduate students and postdoctoral researchers.
Facilitating the development and deployment of biomedical digital twins requires operationalizing the findings and recommendations of the NASEM DT REPORT, which raises a series of specific and unique challenges in the biomedical domain. More specifically, there are numerous steps that need to be taken to convert the highly complex simulation models of biological processes developed by members of the MSM Consortium into biomedical digital twins that are compliant with the definition of digital twins presented in the NASEM DT REPORT. There are also identified challenges associated with these various steps. Some of these challenges can benefit from lessons learned in other domains that have developed digital twins while others will require the development of new techniques in the fields of statistics, computational mathematics and mathematical biology. This task will require multidisciplinary collaborations between mathematicians, computational researchers, experimental biologists and clinicians. This IMAG/MSM meeting will promote the concepts of Team Science to bring together experienced multiscale modeling researchers and experts from the mathematical, statistical, computational, experimental and clinical communities to form the multidisciplinary teams needed to operationalize the findings of the NASEM DT REPORT. The website for this meeting is at https://www.imagwiki.nibib.nih.gov/news-events/announcements/2024-imagmsm-meeting-september-30-october-2-2024, with the landing page for the Interagency Modeling and Analysis Group at https://www.imagwiki.nibib.nih.gov/.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413610","Subsampling Based Inference for Large Networks","DMS","STATISTICS","08/15/2024","08/02/2024","Yuguo Chen","IL","University of Illinois at Urbana-Champaign","Standard Grant","Yulia Gel","07/31/2027","$150,000.00","","yuguo@uiuc.edu","506 S WRIGHT ST","URBANA","IL","618013620","2173332187","MPS","126900","","$0.00","Large-scale networks are being generated in many scientific fields, including biological sciences, social sciences, and physical sciences. The project addresses the urgent need to develop scalable subsampling algorithms for statistical inference on such networks and provide theoretical guarantees on the performance. The proposed methods will be applied to biological and social network data, and will be used to study mindfulness-based therapies for disorders associated with hearing loss, such as tinnitus. The project offers opportunities for involvement of graduate and undergraduate students with diverse backgrounds and interests. The proposed methods will be incorporated into relevant courses. Research results will be disseminated to the scientific communities, and all software developed in this research will be freely distributed as open-source to the public.
The project will develop subsampling based methods for inference problems, such as model selection and hypothesis testing, for large-scale networks, and investigate theoretical properties of these methods to provide statistical guarantees on performance. The subsampling strategies will be applied to a broad range of models for networks, including stochastic block models, random dot product graph models, latent space models, and other models for networks. The theoretical properties of subsampling methods investigated in this project include the consistency of model selection, hypothesis testing, and parameter estimation. The proposed subsampling methods will be applied to real network data from social and natural sciences.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2413701","Advancing Nonparametric Online Inference: Optimal Uncertainty Quantification and Decision-Making for Streaming Data","DMS","STATISTICS","08/15/2024","07/31/2024","Meimei Liu","VA","Virginia Polytechnic Institute and State University","Continuing Grant","Yong Zeng","07/31/2027","$50,000.00","","meimeiliu@vt.edu","300 TURNER ST NW","BLACKSBURG","VA","240603359","5402315281","MPS","126900","075Z, 079Z","$0.00","This research project will enhance data science by developing new methodologies for uncertainty quantification and decision-making in large-scale, streaming data with complex structures. These novel statistical tools will improve real-time analysis in fields such as mobile health, infectious disease surveillance, and neuroscience. In healthcare, the developed methods will advance diagnostic accuracy and treatment precision across various medical conditions like diabetes and Alzheimer's disease, enabling timely interventions and personalized care strategies based on real-time data analysis. Additionally, the project will integrate its research into K-12 educational programs and offer training opportunities for graduate and undergraduate students, focusing on engaging underrepresented groups in STEM. This initiative aims to cultivate the next generation of data scientists and statisticians, equipping them with the vital skills needed to address future challenges in data-driven fields.
The research will establish a unified framework for nonparametric online statistical inference using an online multiplier bootstrap approach combined with functional stochastic gradient descent (SGD) algorithms. This framework will include local and global confidence intervals and bands, pattern and signal detection via hypothesis testing, and real-time decision-making strategies for nonparametric regressions. These methods will be applicable to various data scenarios, from independent to dependent data. The project will characterize the non-asymptotic behavior of the functional SGD estimator, validate the consistency of the multiplier bootstrap method, establish honest confidence bands, and demonstrate minimax optimal testing consistency of the proposed inference tools. By developing a solid foundation with accompanying software for nonparametric online inference, this research will advance methodologies in online data-driven decision-making, with broad applications ranging from mobile health to financial markets.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2425995","Conference: The SIAM Quantum Intersections Convening","DMS","FET-Fndtns of Emerging Tech, OFFICE OF MULTIDISCIPLINARY AC, INFRASTRUCTURE PROGRAM, APPLIED MATHEMATICS, TOPOLOGY, FOUNDATIONS, STATISTICS, QIS - Quantum Information Scie, MATHEMATICAL BIOLOGY","08/01/2024","07/31/2024","Suzanne Weekes","PA","Society For Industrial and Applied Math (SIAM)","Standard Grant","Tomek Bartoszynski","07/31/2025","$349,996.00","","weekes@siam.org","3600 MARKET ST FL 6","PHILADELPHIA","PA","191042669","2153829800","MPS","089Y00, 125300, 126000, 126600, 126700, 126800, 126900, 728100, 733400","7203, 7556","$0.00","Society for Industrial and Applied Mathematics (SIAM) will host the SIAM Quantum Intersections Convening - Integrating Mathematical Scientists into Quantum Research to bring quantum-curious mathematical scientists together with leading experts in quantum science for a three-day interactive workshop. Recognizing the critical role of mathematical scientists, this convening aims to promote multidisciplinary collaborations that bridge the gap between mathematics and quantum sciences and aims to foster and increase the involvement and visibility of mathematicians and statisticians in quantum science research and education. The convening will be organized by a steering committee and will be supported by professional facilitators. Participants will learn from and connect with physicists, computer scientists, engineers and mathematical scientists who are experts in quantum science. This in-person gathering will be held in fall 2024 in the Washington DC area. A primary deliverable from the convening will be a report summarizing the activities and recommendations generated during the event. Key presentations will be recorded and will be available on a SIAM webpage.
Society for Industrial and Applied Mathematics (SIAM) will host this convening with the goals of (i) making more mathematical scientists aware of the demand for their expertise in quantum research and articulating areas and problems where they can contribute, (ii) increasing the participation of researchers in mathematical sciences in the quantum information science revolution to accelerate its research and development, (iii) providing a seeding ground for partnerships and collaborations of mathematical scientists with physicists, computer scientists, and engineers from industry and academia, and (iv) recommending activities to develop a quantum science and technology workforce pipeline in the mathematical and computational sciences. A few topics in quantum science where mathematics can help research and discovery include quantum computing, quantum algorithms, quantum optimization, quantum error corrections, quantum information theory, quantum cryptography, quantum sensing and metrology, and quantum networks.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413243","A Statistical Foundation of In-Context Learning and Chain-of-Thought Prompting with Large Language Models","DMS","STATISTICS","08/01/2024","07/24/2024","Zhuoran Yang","CT","Yale University","Continuing Grant","Tapabrata Maiti","07/31/2027","$80,882.00","","zhuoran.yang@yale.edu","150 MUNSON ST","NEW HAVEN","CT","065113572","2037854689","MPS","126900","1269","$0.00","Large Language Models (LLMs) like GPT-4 have transformed natural language processing and related fields by demonstrating unprecedented capabilities in interpreting human instructions and completing complex reasoning tasks. Representing a paradigm shift in statistical machine learning, these models are trained on extensive text corpora and can perform novel tasks without modifications to their parameters. This project seeks to develop a comprehensive theoretical framework to understand mainstream methods used with deployed LLMs through a statistical lens. The anticipated broader impacts of this research include enriching educational curricula at participating institutions and providing significant training opportunities for both graduate and undergraduate students. Moreover, the project's outcomes are expected to enhance high-impact applications in sectors such as robotics and transportation systems, thereby improving the practical deployment of LLMs in complex decision-making scenarios.
Specifically, this research explores the statistical foundations of various prompting methods utilized in LLMs, including in-context learning (ICL) and chain-of-thought (CoT) prompting. The study is organized around three main thrusts: first, deciphering how LLMs perform ICL and CoT as forms of implicit Bayesian inference and understanding how transformer architectures' attention mechanisms approximately encode these Bayesian estimators. Second, the project will develop algorithms to analyze the statistical errors, incurred during the pre-training and prompting stages, associated with these prompting-based estimators. The third thrust aims to apply this theoretical framework to real-world applications like robotic control and autonomous driving, formulating principled methods that utilize pre-trained LLMs for complex decision-making. By establishing a robust statistical foundation for prompting-based methodologies, this research aims to advance the field of prompt engineering and contribute to the development of principled methods for using LLMs.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413721","New Directions in Bayesian Heterogeneous Data Integration: Methods, Theory and Applications","DMS","STATISTICS","07/01/2024","06/17/2024","Sharmistha Guha","TX","Texas A&M University","Continuing Grant","Tapabrata Maiti","06/30/2027","$49,899.00","","sharmistha@tamu.edu","400 HARVEY MITCHELL PKY S STE 30","COLLEGE STATION","TX","778454375","9798626777","MPS","126900","1269","$0.00","As the scientific community is moving into a data-driven era, there is an unprecedented opportunity for the integrative analysis of network and functional data from multiple sources to uncover important scientific insights which might be missing when these data sources are analyzed in isolation. To this end, this project plans to transform the current landscape of integrating network and functional data, leveraging their combined strength for scientific advancements through the development of innovative hierarchical Bayesian statistical models. The proposed work holds transformative promise in vital scientific domains, such as cognitive and motor aging, and neurodegenerative diseases. It will enhance scientific collaborations with neuroscientists using multi-source image data for targeted investigations of key brain regions significant in the study of motor and cognitive aging. Moreover, the proposed research will facilitate the prediction of images, traditionally acquired via costly imaging modalities, utilizing images from more cost-effective alternatives, which is poised to bring about transformative changes in the healthcare economy. The open-source software and educational materials created will be maintained and accessible to a wider audience of statisticians and domain experts. This accessibility is anticipated to foster widespread adoption of these techniques among statisticians and domain scientists. The PI's involvement in conference presentations, specialized course development, curriculum expansion, graduate student mentoring, undergraduate research engagement with a focus on under-represented backgrounds, and provision of short courses will enhance dissemination efforts and encourage diverse utilization of the developed methods.
The proposed project aims to address the urgent need for principled statistical approaches to seamlessly merge information from diverse sources, including modern network and functional data. It challenges the prevailing trend of analyzing individual data sources, which inherently limits the potential for uncovering innovative scientific insights that could arise from integrating multiple sources. Hierarchical Bayesian models are an effective way to capture the complex structures in network and functional data. These models naturally share information among heterogeneous objects, providing comprehensive uncertainty in inference through science-driven joint posterior distributions. Despite the potential advantages of Bayesian perspectives, their widespread adoption is hindered by the lack of theoretical guarantees, computational challenges, and difficulties in specifying robust priors for high-dimensional problems. This proposal will address these limitations by integrating network and functional data, leveraging their combined strength for scientific advancements through the development of innovative hierarchical Bayesian models. Specifically, the project will develop a semi-parametric joint regression framework with network and functional responses, deep network regression with multiple network responses, and Bayesian interpretable deep neural network regression with functional response on network and functional predictors. Besides offering a novel toolbox for multi-source object data integration, the proposed approach will advance the emerging field of interpretable deep learning for object regression by formulating novel and interpretable deep neural networks that combine predictive power with statistical model interpretability. The project will develop Bayesian asymptotic results to guarantee accurate parametric and predictive inference from these models as a function of network and functional features and sample size, an unexplored domain in the Bayesian integration of multi-object data. The proposed methodology will significantly enhance the seamless integration of multimodal neuroimaging data, leading to principled inferences and deeper comprehension of brain structure and function in the study of Alzheimer's disease and aging.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
@@ -49,10 +50,10 @@
"2412832","Collaborative Research: Statistical Modeling and Inference for Object-valued Time Series","DMS","STATISTICS","07/01/2024","06/17/2024","Changbo Zhu","IN","University of Notre Dame","Continuing Grant","Jun Zhu","06/30/2027","$56,755.00","","czhu4@nd.edu","836 GRACE HALL","NOTRE DAME","IN","465566031","5746317432","MPS","126900","","$0.00","Random objects in general metric spaces have become increasingly common in many fields. For example, the intraday return path of a financial asset, the age-at-death distributions, the annual composition of energy sources, social networks, phylogenetic trees, and EEG scans or MRI fiber tracts of patients can all be viewed as random objects in certain metric spaces. For many endeavors in this area, the data being analyzed is collected with a natural ordering, i.e., the data can be viewed as an object-valued time series. Despite its prevalence in many applied problems, statistical analysis for such time series is still in its early development. A fundamental difficulty of developing statistical techniques is that the spaces where these objects live are nonlinear and commonly used algebraic operations are not applicable. This research project aims to develop new models, methodology and theory for the analysis of object-valued time series. Research results from the project will be disseminated to the relevant scientific communities via publications, conference and seminar presentations. The investigators will jointly mentor a Ph.D. student and involve undergraduate students in the research, as well as offering advanced topic courses to introduce the state-of-the-art techniques in object-valued time series analysis.
The project will develop a systematic body of methods and theory on modeling and inference for object-valued time series. Specifically, the investigators propose to (1) develop a new autoregressive model for distributional time series in Wasserstein geometry and a suite of tools for model estimation, selection and diagnostic checking; (2) develop new specification testing procedures for distributional time series in the one-dimensional Euclidean space; and (3) develop new change-point detection methods to detect distribution shifts in a sequence of object-valued time series. The above three projects tackle several important modeling and inference issues in the analysis of object-valued time series, the investigation of which will lead to innovative methodological and theoretical developments, and lay groundwork for this emerging field.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2412403","Robust Extensions to Bayesian Regression Trees for Complex Data","DMS","STATISTICS","08/01/2024","06/17/2024","HENGRUI LUO","TX","William Marsh Rice University","Continuing Grant","Tapabrata Maiti","07/31/2027","$58,710.00","","hl180@rice.edu","6100 MAIN ST","Houston","TX","770051827","7133484820","MPS","126900","","$0.00","This project is designed to extend the capabilities of tree-based models within the context of machine learning. Tree-based models allow for decision-making based on clear, interpretable rules and are widely adopted in diagnostic and learning tasks. This project will develop novel methodologies to enhance their robustness. Specifically, the research will integrate deep learning techniques with tree-based statistical methods to create models capable of processing complex, high-dimensional data from medical imaging, healthcare, and AI sectors. These advancements aim to significantly improve prediction and decision-making processes, enhancing efficiency and accuracy across a broad range of applications. The project also prioritizes inclusivity and education by integrating training components, thereby advancing scientific knowledge and disseminating results through publications and presentations.
The proposed research leverages Bayesian hierarchies and transformation techniques on trees to develop models capable of managing complex transformations of input data. These models will be tailored to improve interpretability, scalability, and robustness, overcoming current limitations in non-parametric machine learning applications. The project will utilize hierarchical layered structures, where outputs from one tree serve as inputs to subsequent trees, forming network architectures that enhance precision in modeling complex data patterns and relationships. Bayesian techniques will be employed to effectively quantify uncertainty and create ensembles, providing reliable predictions essential for critical offline prediction and real-time decision-making processes. This initiative aims to develop pipelines and set benchmarks for the application of tree-based models across diverse scientific and engineering disciplines.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2412015","Statistical methods for point-process time series","DMS","STATISTICS","07/01/2024","06/17/2024","Daniel Gervini","WI","University of Wisconsin-Milwaukee","Standard Grant","Jun Zhu","06/30/2027","$149,989.00","","gervini@uwm.edu","3203 N DOWNER AVE # 273","MILWAUKEE","WI","532113188","4142294853","MPS","126900","","$0.00","This research project will develop statistical models and inference methods for the analysis of random point processes. Random point processes are events that occur at random in time or space according to certain patterns; this project will provide methods for the discovery and analysis of such patterns. Examples of events that can be modelled as random point processes include cyberattacks on a computer network, earthquakes, crimes in a city, spikes of neural activity in humans and animals, car crashes in a highway, and many others. Therefore, the methods to be developed under this project will find applications in many fields, such as national security, economy, neuroscience and geosciences, among others. The project will also provide training opportunities for graduate and undergraduate students in the field of Data Science.
This project will specifically develop statistical tools for the analysis of time series of point processes, that is, for point processes that are observed repeatedly over time; for example, when the spatial distribution of crime in a city is observed for several days. These tools will include trend estimation methods, autocorrelation estimation methods, and autoregressive models. Research activities in this project include the development of parameter estimation procedures, their implementation in computer programs, the study of theoretical large sample properties of these methods, the study of small sample properties by simulation, and their application to real-data problems. Other activities in this project include educational activities, such as the supervision of Ph.D. and Master's students, and the development of graduate and undergraduate courses in Statistics and Data Science.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2412628","Collaborative Research: Partial Priors, Regularization, and Valid & Efficient Probabilistic Structure Learning","DMS","STATISTICS","07/01/2024","06/17/2024","Ryan Martin","NC","North Carolina State University","Standard Grant","Yulia Gel","06/30/2027","$160,000.00","","rgmarti3@ncsu.edu","2601 WOLF VILLAGE WAY","RALEIGH","NC","276950001","9195152444","MPS","126900","1269","$0.00","Modern applications of statistics aim to solve complex scientific problems involving high-dimensional unknowns. One feature that these applications often share is that the high-dimensional unknown is believed to satisfy a complexity-limiting, low-dimensional structure. Specifics of the posited low-dimensional structure are mostly unknown, so a statistically interesting and scientifically relevant problem is structure learning, i.e., using data to learn the latent low-dimensional structure. Because structure learning problems are ubiquitous and reliable uncertainty quantification is imperative, results from this project will have an impact across the biomedical, physical, and social sciences. In addition, the project will offer multiple opportunities for career development of new generations of statisticians and data scientists.
Frequentist methods focus on data-driven estimation or selection of a candidate structure, but currently there are no general strategies for reliable uncertainty quantification concerning the unknown structure. Bayesian methods produce a data-dependent probability distribution over the space of structures that can be used for uncertainty quantification, but it comes with no reliability guarantees. A barrier to progress in reliable uncertainty quantification is the oppositely extreme perspectives: frequentists' anathema of modeling structural/parametric uncertainty versus Bayesians' insistence that such uncertainty always be modeled precisely and probabilistically. Overcoming this barrier requires a new perspective falling between these two extremes, and this project will develop a new framework that features a more general and flexible perspective on probability, namely, imprecise probability. Most importantly, this framework will resolve the aforementioned issues by offering new and powerful methods boasting provably reliable uncertainty quantification in structure learning applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2412408","Monitoring time series in structured function spaces","DMS","STATISTICS","07/01/2024","06/14/2024","Piotr Kokoszka","CO","Colorado State University","Standard Grant","Yulia Gel","06/30/2027","$292,362.00","","Piotr.Kokoszka@colostate.edu","601 S HOWES ST","FORT COLLINS","CO","805212807","9704916355","MPS","126900","1269","$0.00","This project aims to develop new mathematical theory and statistical tools that will enable monitoring for changes in complex systems, for example global trade networks. Comprehensive databases containing details of trade between almost all countries are available. Detecting in real time a change in the typical pattern of trade and identifying countries where this change takes place is an important problem. This project will provide statistical methods that will allow making decisions about an emergence of an atypical pattern in a complex system in real time with certain theoretical guarantees. The project will also offer multiple interdisciplinary training opportunities for the next generation of statisticians and data scientists.
The methodology that will be developed is related to sequential change point detection, but is different because the in-control state is estimated rather than assumed. This requires new theoretical developments because it deals with complex infinite dimensional systems, whereas existing mathematical tools apply only to finite-dimensional systems. Panels of structured functions will be considered and methods for on-line identification of components undergoing change will be devised. All methods will be inferential with controlled probabilities of type I errors. Some of the key aspects of the project can be summarized in the following points. First, statistical theory leading to change point monitoring schemes in infinite dimensional function spaces will be developed. Second, strong approximations valid in Banach spaces will lead to assumptions not encountered in scalar settings and potentially to different threshold functions. Third, for monitoring of random density functions, the above challenges will be addressed in custom metric spaces. Fourth, since random densities are not observable, the effect of estimation will be incorporated. The new methodology will be applied to viral load measurements, investment portfolios, and global trade data.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
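As a concrete point of reference for the sequential monitoring scheme described above, the following toy Python snippet estimates the in-control mean from a training sample and then monitors a CUSUM-type detector with a heuristic square-root-time boundary. Everything here (the statistic, the boundary constant, the simulated curves) is an illustrative assumption, not the project's methodology.

```python
import numpy as np

rng = np.random.default_rng(0)
T_train, grid = 200, 50                    # training size, grid points per curve
train = rng.normal(size=(T_train, grid))   # in-control functional observations
mu_hat = train.mean(axis=0)                # estimated in-control mean curve
sigma_hat = train.std(axis=0, ddof=1)

def monitor(stream, mu, sigma, c=3.0):
    """Alarm at the first k where the CUSUM of standardized squared
    deviations exceeds a heuristic boundary c * sqrt(k)."""
    cusum = 0.0
    for k, curve in enumerate(stream, start=1):
        cusum += np.mean(((curve - mu) / sigma) ** 2) - 1.0
        if cusum > c * np.sqrt(k):
            return k
    return None

stream = rng.normal(size=(300, grid))      # monitoring period ...
stream[100:] += 0.5                        # ... with a mean shift at time 100
print("alarm raised at monitoring time:", monitor(stream, mu_hat, sigma_hat))
```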
"2413952","Collaborative Research: Statistical Inference for High Dimensional and High Frequency Data: Contiguity, Matrix Decompositions, Uncertainty Quantification","DMS","STATISTICS","07/01/2024","06/21/2024","Per Mykland","IL","University of Chicago","Standard Grant","Jun Zhu","06/30/2027","$219,268.00","","mykland@galton.uchicago.edu","5801 S ELLIS AVE","CHICAGO","IL","606375418","7737028669","MPS","126900","","$0.00","To pursue the promise of the big data revolution, the current project is concerned with a particular form of such data, high dimensional high frequency data (HD2), where series of high-dimensional observations can see new data updates in fractions of milliseconds. With technological advances in data collection, HD2 data occurs in medicine (from neuroscience to patient care), finance and economics, geosciences (such as earthquake data), marine science (fishing and shipping), and, of course, in internet data. This research project focuses on how to extract information from HD2 data, and how to turn this data into knowledge. As part of the process, the project develops cutting-edge mathematics and statistical methodology to uncover the dependence structure governing HD2 data. In addition to developing a general theory, the project is concerned with applications to financial data, including risk management, forecasting, and portfolio management. More precise estimators, with improved margins of error, will be useful in all these areas of finance. The results will be of interest to main-street investors, regulators and policymakers, and the results will be entirely in the public domain. The project will also provide research training opportunities for students.
In more detail, the project will focus on four linked questions for HD2 data: contiguity, matrix decompositions, uncertainty quantification, and the estimation of spot quantities. The investigators will extend their contiguity theory to the common case where observations have noise, which also permits the use of longer local intervals. Under a contiguous probability, the structure of the observations is often more accessible (frequently Gaussian) in local neighborhoods, facilitating statistical analysis. This is achieved without altering the underlying models. Because the effect of the probability change is quite transparent, this approach also enables more direct uncertainty quantification. To model HD2 data, the investigators will explore time-varying matrix decompositions, including the development of a singular value decomposition (SVD) for high frequency data, as a more direct path to a factor model. Both SVD and principal component analysis (PCA) benefit from contiguity, which eases both the time-varying construction, and uncertainty quantification. The latter is of particular importance not only to set standard errors, but also to determine the trade-offs involved in estimation under longitudinal variation: for example, how many minutes or days are required to estimate a covariance matrix, or singular vectors? The investigators also plan to develop volatility matrices for the drift part of a financial process, and their PCAs. The work on matrix decompositions will also benefit from projected results on spot estimation, which also ties in with contiguity. It is expected that the consequences of the contiguity and the HD2 inference will be transformational, leading to more efficient estimators and better prediction, and that this approach will form a new paradigm for high frequency data.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
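To make the time-varying SVD/PCA theme concrete, here is a minimal simulation (not the investigators' code) that tracks the leading singular vector of a one-factor panel over rolling windows; the window length and sign convention are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n_times, n_assets, window = 2000, 30, 250
factor = rng.normal(size=n_times)                    # latent common factor
loadings = rng.normal(size=n_assets)
returns = np.outer(factor, loadings) + 0.5 * rng.normal(size=(n_times, n_assets))

for start in range(0, n_times - window + 1, window):
    block = returns[start:start + window]
    block = block - block.mean(axis=0)
    _, _, vt = np.linalg.svd(block, full_matrices=False)  # local SVD
    v = vt[0] * np.sign(vt[0] @ loadings)                 # resolve sign ambiguity
    corr = abs(np.corrcoef(v, loadings)[0, 1])
    print(f"window starting {start:4d}: |corr(local loadings, truth)| = {corr:.3f}")
```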
"2413748","Collaborative Research: NSF MPS/DMS-EPSRC: Stochastic Shape Processes and Inference","DMS","STATISTICS","08/01/2024","06/20/2024","Anuj Srivastava","FL","Florida State University","Standard Grant","Yulia Gel","07/31/2027","$200,000.00","","anuj@stat.fsu.edu","874 TRADITIONS WAY","TALLAHASSEE","FL","323060001","8506445260","MPS","126900","1269, 7929","$0.00","The intimate link between form, or shape, and function is ubiquitous in science. In biology, for instance, the shapes of biological components are pivotal in understanding patterns of normal behavior and growth; a notable example is protein shape, which contributes to our understanding of protein function and classification. This project, led by a team of investigators from the USA and the UK, will develop ways of modeling how biological and other shapes change with time, using formal statistical frameworks that capture not only the changes themselves, but how these changes vary across objects and populations. This will enable the study of the link between form and function in all its variability. As example applications, the project will develop models for changes in cell morphology and topology during motility and division, and changes in human posture during various activities, facilitating the exploration of scientific questions such as how and why cell division fails, or how to improve human postures in factory tasks. These are proofs of concept, but the methods themselves will have much wider applicability. This project will thus not only progress the science of shape analysis and the specific applications studied; it will have broader downstream impacts on a range of scientific application domains, providing practitioners with general and useful tools.
While there are several approaches for representing and analyzing static shapes, encompassing curves, surfaces, and complex structures like trees and shape graphs, the statistical modeling and analysis of dynamic shapes has received limited attention. Mathematically, shapes are elements of quotient spaces of nonlinear manifolds, and shape changes can be modeled as stochastic processes, termed shape processes, on these complex spaces. The primary challenges lie in adapting classical modeling concepts to the nonlinear geometry of shape spaces and in developing efficient statistical tools for computation and inference in such very high-dimensional, nonlinear settings. The project consists of three thrust areas, dealing with combinations of discrete and continuous time, and discrete and continuous representations of shape, with a particular emphasis on the issues raised by topology changes. The key idea is to integrate spatiotemporal registration of objects and their evolution into the statistical formulation, rather than treating them as pre-processing steps. This project will specifically add to the current state-of-the-art in topic areas such as stochastic differential equations on shape manifolds, time series models for shapes, shape-based functional data analysis, and modeling and inference on infinite-dimensional shape spaces.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
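The registration step that the abstract proposes to fold into the statistical model can be illustrated, in its simplest classical form, by ordinary Procrustes alignment of two planar point-set shapes; the snippet below shows only this baseline building block, with simulated landmarks.

```python
import numpy as np

rng = np.random.default_rng(2)
k = 40
t = np.linspace(0, 2 * np.pi, k, endpoint=False)
shape_a = np.column_stack([np.cos(t), np.sin(2 * t) / 2])     # template landmarks
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
shape_b = (1.5 * shape_a @ R.T + np.array([3.0, -1.0])
           + 0.01 * rng.normal(size=(k, 2)))                  # transformed + noise

def procrustes_distance(a, b):
    """Residual after aligning b to a over translation, scale, and rotation."""
    a0, b0 = a - a.mean(axis=0), b - b.mean(axis=0)
    a0, b0 = a0 / np.linalg.norm(a0), b0 / np.linalg.norm(b0)
    u, _, vt = np.linalg.svd(b0.T @ a0)     # optimal rotation via SVD
    return np.linalg.norm(b0 @ (u @ vt) - a0)

print("Procrustes distance after alignment:", procrustes_distance(shape_a, shape_b))
```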
+"2412628","Collaborative Research: Partial Priors, Regularization, and Valid & Efficient Probabilistic Structure Learning","DMS","STATISTICS","07/01/2024","06/17/2024","Ryan Martin","NC","North Carolina State University","Standard Grant","Yulia Gel","06/30/2027","$160,000.00","","rgmarti3@ncsu.edu","2601 WOLF VILLAGE WAY","RALEIGH","NC","276950001","9195152444","MPS","126900","1269","$0.00","Modern applications of statistics aim to solve complex scientific problems involving high-dimensional unknowns. One feature that these applications often share is that the high-dimensional unknown is believed to satisfy a complexity-limiting, low-dimensional structure. Specifics of the posited low-dimensional structure are mostly unknown, so a statistically interesting and scientifically relevant problem is structure learning, i.e., using data to learn the latent low-dimensional structure. Because structure learning problems are ubiquitous and reliable uncertainty quantification is imperative, results from this project will have an impact across the biomedical, physical, and social sciences. In addition, the project will offer multiple opportunities for career development of new generations of statisticians and data scientists.
Frequentist methods focus on data-driven estimation or selection of a candidate structure, but currently there are no general strategies for reliable uncertainty quantification concerning the unknown structure. Bayesian methods produce a data-dependent probability distribution over the space of structures that can be used for uncertainty quantification, but it comes with no reliability guarantees. A barrier to progress in reliable uncertainty quantification is the pair of extreme opposing perspectives: frequentists' aversion to modeling structural/parametric uncertainty versus Bayesians' insistence that such uncertainty always be modeled precisely and probabilistically. Overcoming this barrier requires a new perspective falling between these two extremes, and this project will develop a new framework that features a more general and flexible perspective on probability, namely, imprecise probability. Most importantly, this framework will resolve the aforementioned issues by offering new and powerful methods boasting provably reliable uncertainty quantification in structure learning applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413864","Statistical Properties of Neural Networks","DMS","STATISTICS","07/01/2024","06/18/2024","Sourav Chatterjee","CA","Stanford University","Standard Grant","Tapabrata Maiti","06/30/2027","$225,000.00","","souravc@stanford.edu","450 JANE STANFORD WAY","STANFORD","CA","943052004","6507232300","MPS","126900","1269","$0.00","Neural networks have revolutionized science and engineering in recent years, but their theoretical properties are still poorly understood. The proposed projects aim to gain a deeper understanding of these theoretical properties, especially the statistical ones. It is a matter of intense debate whether neural networks can ""think"" like humans do, by recognizing logical patterns. The project aims to take a small step towards showing that under ideal conditions, perhaps they can. If successful, this will have impact in a vast range of applications of neural networks. This award includes support and mentoring for graduate students.
In one direction, it is proposed to study features of deep neural networks that distinguish them from classical statistical parametric models. Preliminary results suggest that the lack of identifiability is the differentiating factor. Secondly, it is proposed to investigate the extent to which neural networks may be seen as algorithm approximators, going beyond the classical literature on universal function approximation for neural networks. This perspective may shed light on recent empirical phenomena in neural networks, including the surprising emergent behavior of transformers and large language models.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413834","Collaborative Research: Nonparametric Learning in High-Dimensional Survival Analysis for causal inference and sequential decision making","DMS","STATISTICS","07/01/2024","06/18/2024","Zhezhen Jin","NY","Columbia University","Standard Grant","Jun Zhu","06/30/2027","$100,000.00","","zj7@columbia.edu","615 W 131ST ST","NEW YORK","NY","100277922","2128546851","MPS","126900","","$0.00","Data with survival outcomes are commonly encountered in real-world applications to capture the time duration until a specific event of interest occurs. Nonparametric learning for high dimensional survival data offers promising avenues in practice because of its ability to capture complex relationships and provide comprehensive insights for diverse problems in medical and business services, where vast covariates and individual metrics are prevalent. This project will significantly advance the methods and theory for nonparametric learning in high-dimensional survival data analysis, with a specific focus on causal inference and sequential decision making problems. The study will be of interest to practitioners in various fields, particularly providing useful methods for medical researchers to discover relevant risk factors, assess causal treatment effects, and utilize personalized treatment strategies in contemporary health sciences. It will also provide useful analytics tools beneficial to financial and related institutions for assessing user credit risks and facilitating informed decisions through personalized services. The theoretical and empirical studies to incorporate complex nonparametric structures in high-dimensional survival analysis, together with their interdisciplinary applications, will create valuable training and research opportunities for graduate and undergraduate students, including those from underrepresented minority groups.
Under flexible nonparametric learning frameworks, new embedding methods and learning algorithms will be developed for high dimensional survival analysis. First, the investigators will develop a supervised doubly robust linear embedding method and a supervised nonlinear manifold learning method for supervised dimension reduction of high dimensional survival data, without imposing stringent model or distributional assumptions. Second, a robust nonparametric learning framework will be established for estimating causal treatment effects for high dimensional survival data that allows the covariate dimension to grow much faster than the sample size. Third, motivated by applications in personalized service, the investigators will develop a new nonparametric multi-stage algorithm for high dimensional censored bandit problems that allows flexibility with potentially non-linear decision boundaries and provides optimal regret guarantees.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
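One standard ingredient behind causal estimands with censored outcomes is inverse-probability-of-censoring weighting (IPCW). The sketch below estimates a restricted mean survival time this way, with a Kaplan-Meier model for the censoring distribution; all modeling choices are illustrative assumptions, not the project's algorithms.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
t_event = rng.exponential(2.0, n)          # latent survival times
t_cens = rng.exponential(4.0, n)           # independent censoring times
obs = np.minimum(t_event, t_cens)          # observed follow-up
delta = (t_event <= t_cens).astype(float)  # 1 = event observed

def km_censoring_survival(obs, delta, times):
    """Kaplan-Meier estimate of the censoring survival P(C > t) at `times`."""
    m = len(obs)
    order = np.argsort(obs)
    o, d = obs[order], delta[order]
    at_risk = m - np.arange(m)
    factors = 1.0 - (1.0 - d) / at_risk    # censoring "events" have delta == 0
    surv = np.cumprod(factors)
    idx = np.searchsorted(o, times, side="right") - 1
    return np.where(idx >= 0, surv[np.clip(idx, 0, m - 1)], 1.0)

tau = 3.0                                  # restriction time
y = np.minimum(obs, tau)                   # restricted follow-up time
uncensored_by_tau = (delta == 1) | (obs >= tau)
g = km_censoring_survival(obs, delta, y - 1e-12)   # left limit G(t-)
est = np.mean(uncensored_by_tau * y / np.clip(g, 1e-3, None))
print("IPCW estimate of E[min(T, 3)]:", round(est, 3))
print("oracle value from latent times:", round(np.mean(np.minimum(t_event, tau)), 3))
```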
"2413405","Collaborative Research: Statistical Optimal Transport: Foundation, Computation and Applications","DMS","STATISTICS","07/01/2024","06/18/2024","Kengo Kato","NY","Cornell University","Standard Grant","Yong Zeng","06/30/2027","$160,000.00","","kk976@cornell.edu","341 PINE TREE RD","ITHACA","NY","148502820","6072555014","MPS","126900","079Z","$0.00","Comparing probability models is a fundamental task in almost every data-enabled problem, and Optimal Transport (OT) offers a powerful and versatile framework to do so. Recent years have witnessed a rapid development of computational OT, which has expanded applications of OT to statistics, including clustering, generative modeling, domain adaptation, distribution-to-distribution regression, dimension reduction, and sampling. Still, understanding the fundamental strengths and limitations of OT as a statistical tool is much to be desired. This research project aims to fill this important gap by advancing statistical analysis (estimation and inference) and practical approximation of two fundamental notions (average and quantiles) in statistics and machine learning, demonstrated through modern applications for measure-valued data. The project also provides research training opportunities for graduate students.
The award contains three main research projects. The first project will develop a new regularized formulation of the Wasserstein barycenter based on the multi-marginal OT and conduct an in-depth statistical analysis, encompassing sample complexity, limiting distributions, and bootstrap consistency. The second project will establish asymptotic distribution and bootstrap consistency results for linear functionals of OT maps and will study sharp asymptotics for entropically regularized OT maps when regularization parameters tend to zero. Building on the first two projects, the third project explores applications of the OT methodology to two important statistical tasks: dimension reduction and vector quantile regression. The research agenda will develop a novel and computationally efficient principal component method for measure-valued data and a statistically valid duality-based estimator for quantile regression with multivariate responses. The three projects will produce novel technical tools integrated from OT theory, empirical process theory, and partial differential equations, which are essential for OT-based inferential methods and will inspire new applications of OT to measure-valued and multivariate data.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
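For readers unfamiliar with the entropic regularization mentioned in the second project, the textbook Sinkhorn iteration below computes a regularized transport plan between two discrete measures; the regularization level is chosen large enough that the plain (non-log-domain) iteration stays numerically stable.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, (60, 1))           # support points of source measure
y = rng.normal(1.0, 0.8, (80, 1))           # support points of target measure
a = np.full(60, 1 / 60)                     # uniform source weights
b = np.full(80, 1 / 80)                     # uniform target weights
C = (x - y.T) ** 2                          # squared-distance cost matrix
eps = 0.5                                   # entropic regularization level

K = np.exp(-C / eps)
u = np.ones(60)
for _ in range(500):                        # Sinkhorn fixed-point updates
    v = b / (K.T @ u)
    u = a / (K @ v)
plan = u[:, None] * K * v[None, :]          # entropic transport plan
print("max marginal error:",
      max(abs(plan.sum(1) - a).max(), abs(plan.sum(0) - b).max()))
print("regularized transport cost:", (plan * C).sum())
```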
@@ -60,9 +61,9 @@
"2413823","Robust and efficient Bayesian inference for misspecified and underspecified models","DMS","STATISTICS","07/01/2024","06/18/2024","Steven MacEachern","OH","Ohio State University","Standard Grant","Tapabrata Maiti","06/30/2027","$300,000.00","Ju Hee Lee, Hang Joon Kim","snm@stat.osu.edu","1960 KENNY RD","COLUMBUS","OH","432101016","6146888735","MPS","126900","","$0.00","This research project aims to improve data-driven modelling and decision-making. Its focus is on the development of Bayesian methods for low-information settings. Bayesian methods have proven to be tremendously successful in high-information settings where data is of high-quality, the scientific/business background that has generated the data is well-understood, and clear questions are asked. This project will develop a suite of Bayesian methods designed for low-information settings, including those where (i) the data show particular types of deficiencies, such as a preponderance of outlying or ?bad data?, (ii) a limited conceptual understanding of the phenomenon under study leads to a model that leaves a substantial gap between model and reality, producing a misspecified model or a model that is not fully specified, and (iii) when there is a shortage of data, so that the model captures only a very simplified version of reality. The new methods will expand the scope of Bayesian applications, with attention to problems in biomedical applications and psychology. The project will provide training for the next generation of data scientists.
This project has two main threads. For the first, the project will develop diagnostics that allow the analyst to assess the adequacy of portions of a posited model. Such assessments point the way toward elaborations that will bring the model closer to reality, improving the full collection of inferences. These assessments will also highlight limitations of the model, enabling the analyst to know when to make a decision and when to refrain from making one. The second thread will explore the use of sample-size adaptive loss functions for modelling and for inference. Adaptive loss functions have been used by classical statisticians to improve inference by exploiting the bias-variance tradeoff. This thread will blend adaptivity with Bayesian methods. This will robustify inference by providing smoother likelihoods for small and moderate sample sizes and by relying on smoother inference functions when the sample size is limited.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
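A minimal instance of a sample-size adaptive (tempered) likelihood is the conjugate power posterior for a normal mean shown below; the tempering schedule eta(n) is a made-up example chosen to show the flattening effect at small n, not the authors' proposal.

```python
import numpy as np

def power_posterior_params(x, prior_mean=0.0, prior_var=10.0, sigma2=1.0):
    """Conjugate normal-mean update with the likelihood raised to eta(n)."""
    n = len(x)
    eta = n / (n + 10.0)                    # hypothetical adaptive temperature
    prec = 1.0 / prior_var + eta * n / sigma2
    mean = (prior_mean / prior_var + eta * x.sum() / sigma2) / prec
    return mean, 1.0 / prec, eta

rng = np.random.default_rng(5)
for n in (5, 50, 500):
    x = rng.normal(1.0, 1.0, n)
    m, v, eta = power_posterior_params(x)
    print(f"n={n:4d}  eta={eta:.2f}  posterior mean={m:.3f}  posterior sd={np.sqrt(v):.3f}")
```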
"2413484","New Approaches to Sensitivity Analysis in Observational Studies","DMS","STATISTICS","09/01/2024","06/17/2024","Colin Fogarty","MI","Regents of the University of Michigan - Ann Arbor","Continuing Grant","Yong Zeng","08/31/2027","$58,516.00","","fogartyc@umich.edu","1109 GEDDES AVE, SUITE 3300","ANN ARBOR","MI","481091079","7347636438","MPS","126900","075Z, 079Z","$0.00","While randomized experiments remain the gold standard for elucidating cause and effect relations, countless societally important ""what-if?"" questions cannot be addressed through clinical trials for a litany of reasons, ranging from ethical concerns to logistical infeasibility. For this reason, observational studies, wherein the assignment of group status to individuals is outside the control of the researcher, often represent the only path forward for inferring causal effects. While observational data are often inexpensive to collect and plentiful, regrettably, they suffer from inescapable biases due to self-selection. In short, associations between group status and outcomes of interest need not reflect causal effects, as the groups being compared might have considerable differences on the basis of factors unavailable for adjustment. This project will develop new methods for sensitivity analysis in observational studies, which answer the question, ""How much-unmeasured confounding would need to exist to overturn a study's finding of a causal effect?"" Quantifying the robustness of observational findings to hidden bias will help frame the debate around the reliability of such studies, allowing researchers to highlight findings that are particularly resilient to lurking variables. This project provides both theoretical guidance on how to extract the most out of a sensitivity analysis and computationally tractable methods for making this guidance actionable. Moreover, when randomized experimentation is possible, the developed methods will help researchers use existing observational studies for hypothesis generation, enabling them to find sets of promising outcome variables whose causal effects may be verified through follow-up experimentation. This award includes support for work with graduate students.
This project develops a new set of statistical methods for conducting sensitivity analyses after matching. These methods aim to overcome shortcomings of the existing approach, conferring computational, theoretical, and practical benefits. The project will provide a new approach to sensitivity analysis after matching called weighting-after-matching. The project will establish computational benefits, theoretical improvements in design sensitivity, and practical improvements in the power of a sensitivity analysis by using weighting-after-matching in lieu of the traditional unweighted approach. The project will also establish novel methods for sensitivity analysis with multiple outcome variables. These innovations will include a scalable multiple testing procedure for observational studies, facilitating exploratory analysis while providing control of the proportion of false discoveries, and methods for sensitivity analysis using weighting-after-matching for testing both sharp null hypotheses of no effect at all and hypotheses on average treatment effects. Finally, the project will establish previously unexplored benefits from using matching and weighting in combination, two modes of adjustment in observational studies commonly viewed as competitors. This will help bridge the divide between matching estimators and weighting estimators in the context of a sensitivity analysis, thereby providing a natural avenue for theoretical comparisons of these approaches.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
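The question of how much hidden bias would overturn a finding is classically answered with Rosenbaum bounds; the snippet below computes the worst-case sign-test p-value for matched pairs at several values of the sensitivity parameter Gamma. This is standard background that the project builds on, not its new weighting-after-matching methodology.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(6)
n_pairs = 300
diffs = rng.normal(0.3, 1.0, n_pairs)       # treated-minus-control differences
t_plus = int((diffs > 0).sum())             # pairs in which the treated unit did better

for gamma in (1.0, 1.5, 2.0, 3.0):
    # Under bias at most Gamma, the chance a pair favors treatment is
    # bounded by Gamma / (1 + Gamma), giving a worst-case binomial test.
    p_upper = gamma / (1.0 + gamma)
    p_val = binom.sf(t_plus - 1, n_pairs, p_upper)   # P(X >= t_plus)
    print(f"Gamma={gamma:.1f}  worst-case one-sided p-value={p_val:.4f}")
```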
"2412052","Tackling High Dimensionality for Modern Machine Learning: Theory and Visualization","DMS","STATISTICS","07/01/2024","06/17/2024","Yiqiao Zhong","WI","University of Wisconsin-Madison","Continuing Grant","Tapabrata Maiti","06/30/2027","$67,291.00","","yiqiao.zhong@wisc.edu","21 N PARK ST STE 6301","MADISON","WI","537151218","6082623822","MPS","126900","","$0.00","This research project aims to address the recent challenges of modern machine learning from a statistical perspective. Deep Learning and particularly Large Language Models have the potential to transform our society, yet their scientific underpinning is much less developed. In particular, large-scale black-box models are deployed in applications with little understanding about when they may or may not work as expected. The research is expected to advance the understanding of modern machine learning. It will also provide accessible tools to improve the interpretations and safety of models. This award will involve and support graduate students.
The project is motivated by recent statistical phenomena such as double descent and benign overfitting that involve training a model with many parameters. Motivated by the empirical discoveries in Deep Learning, the project will develop insights into overfitting in imbalanced classification in high dimensions and the effects of reparametrization in contrastive learning. Understanding the generalization errors under overparametrization in practical scenarios, such as imbalanced classification, will likely lead to better practice of reducing overfitting. This project will also explore interpretations for black-box models and complicated methods: (1) in Transformers, high-dimensional embedding vectors are decomposed into interpretable components; (2) in t-SNE, embedding points are assessed by metrics related to map discontinuity. By using classical ideas from factor analysis and leave-one-out, this project will result in new visualization tools for interpretations and diagnosis.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
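The double descent phenomenon mentioned above can be reproduced in a few lines with minimum-norm least squares on random features: test error typically peaks near the interpolation threshold p = n and falls again beyond it. The feature map and problem sizes below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)
n, d, n_test = 100, 20, 1000
X, Xt = rng.normal(size=(n, d)), rng.normal(size=(n_test, d))
beta = rng.normal(size=d)
y = X @ beta + 0.5 * rng.normal(size=n)
yt = Xt @ beta + 0.5 * rng.normal(size=n_test)

W = rng.normal(size=(d, 400))               # shared random feature directions
for p in (10, 50, 90, 100, 110, 200, 400):  # sweep past interpolation (p = n)
    F, Ft = np.tanh(X @ W[:, :p]), np.tanh(Xt @ W[:, :p])
    theta = np.linalg.pinv(F) @ y           # minimum-norm least squares
    print(f"p={p:3d}  test MSE={np.mean((Ft @ theta - yt) ** 2):.3f}")
```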
+"2412629","Collaborative Research: Partial Priors, Regularization, and Valid & Efficient Probabilistic Structure Learning","DMS","STATISTICS","07/01/2024","06/17/2024","Chuanhai Liu","IN","Purdue University","Standard Grant","Yulia Gel","06/30/2027","$160,000.00","","chuanhai@purdue.edu","2550 NORTHWESTERN AVE # 1100","WEST LAFAYETTE","IN","479061332","7654941055","MPS","126900","","$0.00","Modern applications of statistics aim to solve complex scientific problems involving high-dimensional unknowns. One feature that these applications often share is that the high-dimensional unknown is believed to satisfy a complexity-limiting, low-dimensional structure. Specifics of the posited low-dimensional structure are mostly unknown, so a statistically interesting and scientifically relevant problem is structure learning, i.e., using data to learn the latent low-dimensional structure. Because structure learning problems are ubiquitous and reliable uncertainty quantification is imperative, results from this project will have an impact across the biomedical, physical, and social sciences. In addition, the project will offer multiple opportunities for career development of new generations of statisticians and data scientists.
Frequentist methods focus on data-driven estimation or selection of a candidate structure, but currently there are no general strategies for reliable uncertainty quantification concerning the unknown structure. Bayesian methods produce a data-dependent probability distribution over the space of structures that can be used for uncertainty quantification, but it comes with no reliability guarantees. A barrier to progress in reliable uncertainty quantification is the pair of extreme opposing perspectives: frequentists' aversion to modeling structural/parametric uncertainty versus Bayesians' insistence that such uncertainty always be modeled precisely and probabilistically. Overcoming this barrier requires a new perspective falling between these two extremes, and this project will develop a new framework that features a more general and flexible perspective on probability, namely, imprecise probability. Most importantly, this framework will resolve the aforementioned issues by offering new and powerful methods boasting provably reliable uncertainty quantification in structure learning applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413891","Nonparametric estimation in causal inference: optimality in traditional models and newer ones","DMS","STATISTICS","08/01/2024","06/14/2024","Matteo Bonvini","NJ","Rutgers University New Brunswick","Continuing Grant","Yong Zeng","07/31/2027","$59,393.00","","mb1662@stat.rutgers.edu","3 RUTGERS PLZ","NEW BRUNSWICK","NJ","089018559","8489320150","MPS","126900","075Z","$0.00","This project provides new methods for estimating causal effects from non-randomized studies. Quantifying the causal effect of a variable on another one is of fundamental importance in science because it allows for the understanding of what happens if a certain action is taken, e.g., if a drug is prescribed to a patient. When randomized experiments are not feasible, e.g., because of costs or ethical concerns, quantifying the effect of a treatment on an outcome can be very challenging. Roughly, this is because the analysis must ensure that the treated and untreated units are ?comparable,? a condition implied by proper randomization. In these settings, the analyst typically proceeds in two steps: 1) they introduce the key assumptions needed to identify the causal effect, and 2) they specify a model for the distribution of the data, often nonparametric, to accommodate modern, complex datasets, as well as the appropriate estimation strategy. One key difficulty in non-randomized studies is that estimating causal effects typically requires estimating nuisance components of the data distribution that are not of direct interest and that can be potentially quite hard to estimate. Focused on the second part of the analysis, this project aims to design optimal methods for estimating causal effects in different settings. Informally, an optimal estimator converges to the true causal effect ?as quickly as possible? as a function of the sample size and thus leads to the most precise inferences. Establishing optimality has thus two fundamental benefits: 1) it leads to procedures that make the most efficient use of the available data, and 2) it serves as a benchmark against which future methods can be evaluated. In this respect, the theoretical and methodological contributions of this project are expected to lead to substantial improvements in the analysis of data from many domains, such as medicine and the social sciences. The project also aims to offer opportunities for training and mentoring graduate and undergraduate students.
For certain estimands and data structures, the principles of semiparametric efficiency theory can be used to derive optimal estimators. However, they are not directly applicable to causal parameters that are ""non-smooth"" or for which the nuisance parts of the data distribution can only be estimated at such slow rates that root-n convergence of the causal effect estimator is not attainable. As part of this project, the Principal Investigator aims to study the optimal estimation of prominent examples of non-smooth parameters, such as causal effects defined by continuous treatments. Furthermore, this project will consider optimal estimation of ""smooth"" parameters, such as certain average causal effects, in newer nonparametric models for which relatively fast rates of convergence are possible, even if certain components of the data distribution can only be estimated at very slow rates. In doing so, the project aims to propose new techniques for reducing the detrimental effect of the nuisance estimators' bias on the quality of the causal effect estimator. It also aims to design and implement inferential procedures for the challenging settings considered, thereby enhancing the adoption of the methods proposed in practice.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
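The canonical ""smooth"" doubly robust functional alluded to here is the augmented inverse probability weighting (AIPW) estimator of an average treatment effect; a compact simulated example follows, with deliberately simple (and here correctly specified) nuisance fits.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 4000
x = rng.normal(size=(n, 3))
pi_true = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))   # true propensity score
a = rng.binomial(1, pi_true)
y = 2.0 * a + x @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), x])
b1 = np.linalg.lstsq(X1[a == 1], y[a == 1], rcond=None)[0]  # outcome fit, treated
b0 = np.linalg.lstsq(X1[a == 0], y[a == 0], rcond=None)[0]  # outcome fit, control
m1, m0 = X1 @ b1, X1 @ b0
pi = np.clip(pi_true, 0.01, 0.99)           # true propensity used to keep the sketch short

aipw = np.mean(m1 - m0 + a * (y - m1) / pi - (1 - a) * (y - m0) / (1 - pi))
print("AIPW estimate of the average treatment effect:", round(aipw, 3), "(truth 2.0)")
```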
"2413557","Collaborative Research: Systemic Shock Inference for High-Frequency Data","DMS","STATISTICS","07/01/2024","06/14/2024","Jose Figueroa-Lopez","MO","Washington University","Standard Grant","Jun Zhu","06/30/2027","$99,957.00","","figueroa@math.wustl.edu","ONE BROOKINGS DR","SAINT LOUIS","MO","63110","3147474134","MPS","126900","","$0.00","Unexpected ?shocks,? or abrupt deviations from periods of stability naturally occur in time-dependent data-generating mechanisms across a variety of disciplines. Examples include crashes in stock markets, flurries of activity on social media following news events, and changes in animal migratory patterns during global weather events, among countless others. Reliable detection and statistical analysis of shock events is crucial in applications, as shock inference can provide scientists deeper understanding of large systems of time-dependent variables, helping to mitigate risk and manage uncertainty. When large systems of time-dependent variables are observed at high sampling frequencies, information at fine timescales can reveal hidden connections and provide insights into the collective uncertainty shared by an entire system. High-frequency observations of such systems appear in econometrics, climatology, statistical physics, and many other areas of empirical science that can benefit from reliable inference of shock events. This project will develop new statistical techniques for the both the detection and analysis of shocks in large systems of time-dependent variables observed at high temporal sampling frequencies. The project will also involve mentoring students, organizing workshops, and promoting diversity in STEM.
The investigators will study shock inference problems in a variety of settings in high dimensions. Special attention will be paid to semi-parametric high-frequency models that display a factor structure. Detection based on time-localized principal component analysis and related techniques will be explored, with the goal of accounting for shock events that impact a large number of component series in a possibly asynchronous manner. Time-localized bootstrapping methods will also be considered for feasible testing frameworks for quantifying the system-level impact of shocks. Complementary lines of inquiry will concern estimation of jump behavior in high-frequency models in multivariate contexts and time-localized clustering methods.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
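A toy version of time-localized principal component analysis for shock detection: track the share of variance captured by the leading principal component in rolling windows and flag windows where it jumps, as it would when a shock hits many component series at once. The threshold is ad hoc and the data simulated.

```python
import numpy as np

rng = np.random.default_rng(9)
T, N, win = 1200, 40, 100
X = rng.normal(size=(T, N))                 # idiosyncratic noise panel
shock = np.zeros(T)
shock[600:640] = rng.normal(0.0, 3.0, 40)   # brief system-wide shock
X += shock[:, None]                         # shock loads on every series

for start in range(0, T - win + 1, win):
    block = X[start:start + win] - X[start:start + win].mean(axis=0)
    eig = np.linalg.eigvalsh(block.T @ block / win)
    share = eig[-1] / eig.sum()             # variance share of the leading PC
    flag = "  <-- possible systemic shock" if share > 0.3 else ""
    print(f"window [{start:4d}, {start + win:4d})  top-PC share = {share:.2f}{flag}")
```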
-"2412629","Collaborative Research: Partial Priors, Regularization, and Valid & Efficient Probabilistic Structure Learning","DMS","STATISTICS","07/01/2024","06/17/2024","Chuanhai Liu","IN","Purdue University","Standard Grant","Yulia Gel","06/30/2027","$160,000.00","","chuanhai@purdue.edu","2550 NORTHWESTERN AVE # 1100","WEST LAFAYETTE","IN","479061332","7654941055","MPS","126900","","$0.00","Modern applications of statistics aim to solve complex scientific problems involving high-dimensional unknowns. One feature that these applications often share is that the high-dimensional unknown is believed to satisfy a complexity-limiting, low-dimensional structure. Specifics of the posited low-dimensional structure are mostly unknown, so a statistically interesting and scientifically relevant problem is structure learning, i.e., using data to learn the latent low-dimensional structure. Because structure learning problems are ubiquitous and reliable uncertainty quantification is imperative, results from this project will have an impact across the biomedical, physical, and social sciences. In addition, the project will offer multiple opportunities for career development of new generations of statisticians and data scientists.
Frequentist methods focus on data-driven estimation or selection of a candidate structure, but currently there are no general strategies for reliable uncertainty quantification concerning the unknown structure. Bayesian methods produce a data-dependent probability distribution over the space of structures that can be used for uncertainty quantification, but it comes with no reliability guarantees. A barrier to progress in reliable uncertainty quantification is the pair of extreme opposing perspectives: frequentists' aversion to modeling structural/parametric uncertainty versus Bayesians' insistence that such uncertainty always be modeled precisely and probabilistically. Overcoming this barrier requires a new perspective falling between these two extremes, and this project will develop a new framework that features a more general and flexible perspective on probability, namely, imprecise probability. Most importantly, this framework will resolve the aforementioned issues by offering new and powerful methods boasting provably reliable uncertainty quantification in structure learning applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2413425","Collaborative Research: Synergies between Steins Identities and Reproducing Kernels: Modern Tools for Nonparametric Statistics","DMS","STATISTICS","07/01/2024","06/17/2024","Bharath Sriperumbudur","PA","Pennsylvania State Univ University Park","Standard Grant","Yong Zeng","06/30/2027","$179,999.00","","bks18@psu.edu","201 OLD MAIN","UNIVERSITY PARK","PA","168021503","8148651372","MPS","126900","079Z","$0.00","The project aims to conduct comprehensive statistical and computational analyses, with the overarching objective of advancing innovative nonparametric data analysis techniques. The methodologies and theories developed are anticipated to push the boundaries of modern nonparametric statistical inference and find applicability in other statistical domains such as nonparametric latent variable models, time series analysis, and sequential nonparametric multiple testing. This project will enhance the interconnections among statistics, machine learning, and computation and provide training opportunities for postdoctoral fellows, graduate students, and undergraduates.
More specifically, the project covers key problems in nonparametric hypothesis testing, intending to establish a robust framework for goodness-of-fit testing for distributions on non-Euclidean domains with unknown normalization constants. The research also delves into nonparametric variational inference, aiming to create a particle-based algorithmic framework with discrete-time guarantees. Furthermore, the project focuses on nonparametric functional regression, with an emphasis on designing minimax optimal estimators using infinite-dimensional Stein's identities. The study also examines the trade-offs between statistics and computation in all the aforementioned methods. The common thread weaving through these endeavors is the synergy between various versions of Stein's identities and reproducing kernels, contributing substantially to the advancement of models, methods, and theories in contemporary nonparametric statistics.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
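The synergy between Stein's identities and reproducing kernels is concretely embodied in the kernel Stein discrepancy (KSD), which measures goodness of fit using only the model's score function, so the unknown normalization constant never enters; a one-dimensional V-statistic version with a Gaussian kernel is sketched below.

```python
import numpy as np

def ksd_vstat(x, score, bw=1.0):
    """V-statistic estimate of the 1-D kernel Stein discrepancy to the model
    with score function `score`, using k(x, y) = exp(-(x - y)^2 / (2 bw^2))."""
    d = x[:, None] - x[None, :]
    k = np.exp(-d**2 / (2 * bw**2))
    dkx = -d / bw**2 * k                    # derivative of k in the first argument
    dky = d / bw**2 * k                     # derivative of k in the second argument
    dkxy = (1 / bw**2 - d**2 / bw**4) * k   # mixed second derivative
    s = score(x)
    h = s[:, None] * s[None, :] * k + s[:, None] * dky + s[None, :] * dkx + dkxy
    return h.mean()

rng = np.random.default_rng(10)
score = lambda z: -z                        # score of N(0, 1); no normalizer needed
print("KSD, data from N(0, 1):  ", ksd_vstat(rng.normal(0.0, 1.0, 500), score))
print("KSD, data from N(0.5, 1):", ksd_vstat(rng.normal(0.5, 1.0, 500), score))
```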
"2412833","Collaborative Research: Statistical Modeling and Inference for Object-valued Time Series","DMS","STATISTICS","07/01/2024","06/17/2024","Xiaofeng Shao","IL","University of Illinois at Urbana-Champaign","Standard Grant","Jun Zhu","06/30/2027","$174,997.00","","xshao@illinois.edu","506 S WRIGHT ST","URBANA","IL","618013620","2173332187","MPS","126900","","$0.00","Random objects in general metric spaces have become increasingly common in many fields. For example, the intraday return path of a financial asset, the age-at-death distributions, the annual composition of energy sources, social networks, phylogenetic trees, and EEG scans or MRI fiber tracts of patients can all be viewed as random objects in certain metric spaces. For many endeavors in this area, the data being analyzed is collected with a natural ordering, i.e., the data can be viewed as an object-valued time series. Despite its prevalence in many applied problems, statistical analysis for such time series is still in its early development. A fundamental difficulty of developing statistical techniques is that the spaces where these objects live are nonlinear and commonly used algebraic operations are not applicable. This research project aims to develop new models, methodology and theory for the analysis of object-valued time series. Research results from the project will be disseminated to the relevant scientific communities via publications, conference and seminar presentations. The investigators will jointly mentor a Ph.D. student and involve undergraduate students in the research, as well as offering advanced topic courses to introduce the state-of-the-art techniques in object-valued time series analysis.
The project will develop a systematic body of methods and theory on modeling and inference for object-valued time series. Specifically, the investigators propose to (1) develop a new autoregressive model for distributional time series in Wasserstein geometry and a suite of tools for model estimation, selection and diagnostic checking; (2) develop new specification testing procedures for distributional time series in the one-dimensional Euclidean space; and (3) develop new change-point detection methods to detect distribution shifts in a sequence of object-valued time series. The above three projects tackle several important modeling and inference issues in the analysis of object-valued time series, the investigation of which will lead to innovative methodological and theoretical developments, and lay groundwork for this emerging field.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
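For one-dimensional distributions, Wasserstein geometry reduces to the L2 geometry of quantile functions, which makes a toy distributional AR(1) easy to simulate and fit; the crude least-squares fit below only gestures at the proposed autoregressive framework.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
T = 60
grid = np.linspace(0.01, 0.99, 99)
mu = np.zeros(T)                            # AR(1) location parameters
for t in range(1, T):
    mu[t] = 0.8 * mu[t - 1] + rng.normal(0.0, 0.3)
Q = mu[:, None] + norm.ppf(grid)[None, :]   # quantile functions of N(mu_t, 1)

# Least-squares fit of a scalar AR coefficient on centered quantile curves,
# i.e., in the L2 geometry that matches 1-D Wasserstein distance.
Zc = Q - Q.mean(axis=0)
phi_hat = np.sum(Zc[1:] * Zc[:-1]) / np.sum(Zc[:-1] ** 2)
print("fitted autoregressive coefficient:", round(phi_hat, 3), "(truth 0.8)")
```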
"2413558","Collaborative Research: Systemic Shock Inference for High-Frequency Data","DMS","STATISTICS","07/01/2024","06/14/2024","Benjamin Boniece","PA","Drexel University","Continuing Grant","Jun Zhu","06/30/2027","$26,626.00","","cooper.boniece@drexel.edu","3141 CHESTNUT ST","PHILADELPHIA","PA","191042875","2158956342","MPS","126900","","$0.00","Unexpected ?shocks,? or abrupt deviations from periods of stability naturally occur in time-dependent data-generating mechanisms across a variety of disciplines. Examples include crashes in stock markets, flurries of activity on social media following news events, and changes in animal migratory patterns during global weather events, among countless others. Reliable detection and statistical analysis of shock events is crucial in applications, as shock inference can provide scientists deeper understanding of large systems of time-dependent variables, helping to mitigate risk and manage uncertainty. When large systems of time-dependent variables are observed at high sampling frequencies, information at fine timescales can reveal hidden connections and provide insights into the collective uncertainty shared by an entire system. High-frequency observations of such systems appear in econometrics, climatology, statistical physics, and many other areas of empirical science that can benefit from reliable inference of shock events. This project will develop new statistical techniques for the both the detection and analysis of shocks in large systems of time-dependent variables observed at high temporal sampling frequencies. The project will also involve mentoring students, organizing workshops, and promoting diversity in STEM.
The investigators will study shock inference problems in a variety of settings in high dimensions. Special attention will be paid to semi-parametric high-frequency models that display a factor structure. Detection based on time-localized principal component analysis and related techniques will be explored, with the goal of accounting for shock events that impact a large number of component series in a possibly asynchronous manner. Time-localized bootstrapping methods will also be considered for feasible testing frameworks for quantifying the system-level impact of shocks. Complementary lines of inquiry will concern estimation of jump behavior in high-frequency models in multivariate contexts and time-localized clustering methods.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
@@ -90,9 +91,9 @@
"2342821","Conference: Emerging Statistical and Quantitative Issues in Genomic Research in Health Sciences","DMS","STATISTICS","02/01/2024","01/24/2024","Xihong Lin","MA","Harvard University","Standard Grant","Tapabrata Maiti","01/31/2027","$61,920.00","","xlin@hsph.harvard.edu","1033 MASSACHUSETTS AVE STE 3","CAMBRIDGE","MA","021385366","6174955501","MPS","126900","7556","$0.00","The 2023 Conference of the Program in Quantitative Genomics (PQG), entitled ?Diversity in Genetics and Genomics? will take place at the Joseph B. Martin Conference Center at the Harvard Medical School on October 17-18, 2023. This long-standing Harvard T. H. Chan School Public Health Program in Quantitative Genomics Conference series focuses on timely interdisciplinary discussions on emerging statistical and computational challenges in genetic and genomic science. The focus of each conference evolves in parallel to scientific frontiers. A key feature of the series is its interdisciplinary nature, where quantitative and subject-matter scientists jointly discuss statistical and quantitative issues that arise in cutting-edge genetic and genomic research in human diseases. Conference participants critique existing quantitative methods, discuss emerging statistical and quantitative issues, identify priorities for future research, and disseminate results. Diversity in genetic and genomics has been increasingly embraced not only for enabling more powerful studies but also because of the need to avoid further exacerbation of structured inequalities in healthcare systems and to chart a path forward for their amelioration. Significant effort has been made in recent years to improve study participant and workforce diversity in genetics and genomics, including the inclusion of diverse groups in discovery and functional studies and translational efforts to empower or pave the road for equitable clinical impact. The 2023 conference will provide a platform to engage inter-disciplinary researchers to have in-depth discussions on the quantitative challenges and opportunities in increasing diversity in genetic and genomic research. We will make serious efforts to recruit junior researchers, including graduate students, postdoctoral fellows, in particular underrepresented minorities and women, as speakers and participants.
The impetus for the 2023 conference theme comes from the pressing need to address the statistical and quantitative issues in diversity in genetic and genomic research. The three topics of the conference include (1) Diversity for gene mapping and studying variant functions; (2) Diversity for translational genetics: polygenic risk and clinical implementation; (3) How do we move forward while acknowledging the past? Examples of the first topic include multi-ancestry genetic association tests, fine-mapping, and eQTL analysis. Examples of the second topic include trans-ethnic polygenic risk prediction and transferred learning. Examples of the third topic include enhancing transparency in the use of population descriptors in genomics research and building global collaborative genetic research frameworks. The education and research activities discussed at the conference will make important contributions to advance efforts on increasing diversity of genetic and genomic research, and will help create the scientific basis and workforce required to ensure and sustain US competitiveness both economically and technologically, prolonging and saving lives, and promoting national security. For more information, see www.hsph.harvard.edu/pqg-conference/.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2403813","Conference: Theory and Foundations of Statistics in the Era of Big Data","DMS","STATISTICS","02/01/2024","02/01/2024","Xin Zhang","FL","Florida State University","Standard Grant","Tapabrata Maiti","01/31/2025","$14,800.00","Srijan Sengupta","henry@stat.fsu.edu","874 TRADITIONS WAY","TALLAHASSEE","FL","323060001","8506445260","MPS","126900","7556","$0.00","The Department of Statistics at Florida State University (FSU) will host a three-day conference titled ""Theory and Foundations of Statistics in the Era of Big Data"" in Tallahassee, Florida, from April 19 to 21, 2024. The main objective of the conference is to bring together a global community of statisticians and data scientists to chart the state-of-the-art, challenges, and the future trajectory of contemporary statistical foundations, theory, and practice. The format of the conference includes three plenary sessions, six invited sessions showcasing current senior leaders in the field who have made foundational contributions to statistics, two special invited sessions for early-career researchers, a poster session for graduate students, and a banquet talk by a leading expert. The special invited sessions and poster session will provide a unique opportunity for early-career researchers and graduate students not only to showcase their research work but also to benefit from in-depth intellectual interactions with leaders in the field in a small conference setting.
The main objective of the conference is to bring together present-day statistics and science innovators and senior leaders with emerging young researchers to identify, discuss, and decipher solutions to these foundational issues and challenges faced by modern-day statistics and data science. Providing support for junior researchers and graduate students who do not have access to other sources of funding to attend this important and timely gathering of researchers working on the foundational aspects of statistical sciences is also key to maintaining the current leadership of U.S. institutions in this field. It is extremely timely to have such an event to stimulate and comprehend the major contemporary challenges in the foundation, theory, and implementation of the field that is currently playing such an important role in every sphere of social media, economic security, public health, and beyond. This conference will be in partnership with the International Indian Statistical Association (IISA) and co-sponsored by the American Statistical Association (ASA), the National Institute of Statistical Science (NISS), and the Institute of Mathematical Statistics (IMS). The conference website is https://sites.google.com/view/theory-and-foundations-of-stat/
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2349991","Conference: Statistics in the Age of AI","DMS","STATISTICS","03/01/2024","12/18/2023","Xiaoke Zhang","DC","George Washington University","Standard Grant","Tapabrata Maiti","02/28/2025","$18,400.00","","xkzhang@gwu.edu","1918 F ST NW","WASHINGTON","DC","200520042","2029940728","MPS","126900","7556","$0.00","The conference ?Statistics in the Age of AI? will be held at George Washington University, Washington, DC on May 9-11, 2024. With the boom of artificial intelligence (AI), partly accelerated by the launching of large language models (e.g., ChatGPT), AI tools have reached every corner of our society. In this era where many business and scientific problems are being tackled via AI systems, statistics has become more critical than ever since it can offer uncertainty quantification, causal analysis, and interpretability among others, which most AI systems are lacking. This conference will bring together researchers and practitioners in academics and industries to explore the impact of AI on statistical research, education, and practice and also to brainstorm how statistics can contribute to AI. The conference organizers encourage participation and attendance by students, post-doctoral scholars, early-career researchers, and individuals from underrepresented groups.
The conference features short courses, poster and oral presentations, and panel discussions. The two short courses will focus on causal inference and conformal inference respectively. The presentations and panel discussions will address efficient handling of data for AI models and architectures, uncertainty quantification, and responsible decision-making among other topics. Further information will become available on the conference website: https://statistics.columbian.gwu.edu/statistics-age-ai.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
+"2340241","CAREER: New Frameworks for Ethical Statistical Learning: Algorithmic Fairness and Privacy","DMS","STATISTICS","07/01/2024","01/23/2024","Linjun Zhang","NJ","Rutgers University New Brunswick","Continuing Grant","Yong Zeng","06/30/2029","$90,127.00","","linjun.zhang@rutgers.edu","3 RUTGERS PLZ","NEW BRUNSWICK","NJ","089018559","8489320150","MPS","126900","1045","$0.00","With the unprecedented impact of data science and machine learning in many aspects of our daily lives, such as healthcare, finance, education, and law, there is an urgent need to design ethical statistical learning algorithms that account for fairness and privacy. This project tackles the challenge of integrating ethical principles into the fabric of statistical learning. The approach prioritizes fairness by enhancing statistical algorithms to perform equitably, particularly in scenarios with limited sample sizes and where sensitive attributes are restricted by legal or societal norms. In parallel, this project addresses privacy by developing a general framework for studying the privacy-accuracy trade-off under new privacy constraints emerging with the advances in generative AI. The practical upshot of this work is the application of these methods to biomedical fields, accompanied by the release of open-source software, broadening the impact and encouraging ethical practices in statistical learning across various domains. This project promotes equitable and private data handling and provides research training opportunities to students.
The research objective of this project is to develop rigorous statistical frameworks for ethical machine learning, with a focus on algorithmic fairness and data privacy. More specifically, the project will: (1) develop innovative statistical methods that ensure fairness in a finite-sample and distribution-free manner; (2) design algorithms that ensure fairness while complying with societal and legal constraints on sensitive data; (3) establish new frameworks to elucidate the trade-off between statistical accuracy and new privacy concepts in generative AI, including machine unlearning and copyright protection. Taken together, the outcome of this research will build a firm foundation of ethical statistical learning and shed light on the development of new theoretical understanding and practical methodology with algorithmic fairness and privacy guarantees.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2338760","CAREER: Statistical Inference in Observational Studies -- Theory, Methods, and Beyond","DMS","STATISTICS","07/01/2024","01/10/2024","Rajarshi Mukherjee","MA","Harvard University","Continuing Grant","Jun Zhu","06/30/2029","$81,885.00","","rmukherj@hsph.harvard.edu","1033 MASSACHUSETTS AVE STE 3","CAMBRIDGE","MA","021385366","6174955501","MPS","126900","1045","$0.00","Causal inference refers to a systematic way of deciphering causal relationships between entities from empirical observations ? an epistemic framework that underlies past, present, and future scientific and social development. For designing statistical methods for causal inference, the gold standard pertains to randomized clinical trials where the researcher assigns treatment/exposure to subjects under study based on pure chance mechanisms. The random assignment negates systematic bias between the observed relationship between the treatment/exposure and outcome due to unknown common factors referred to as confounders. However, randomized clinical trials are often infeasible, expensive, and ethically challenging. In contrast, modern technological advancement has paved the way for the collection of massive amounts of data across a spectrum of possibilities such as health outcomes, environmental pollution, medical claims, educational policy interventions, and genetic mutations among many others. Since accounting for confounders in such data is the fundamental aspect of conducting valid causal inference, one of the major foci of modern causal inference research have been to design procedures to account for complex confounding structures without pre-specifying unrealistic statistical models. Despite the existence of a large canvas of methods in this discourse, the complete picture of the best statistical methods for inferring the causal effect of an exposure on an outcome while adjusting for arbitrary confounders remains largely open. Moreover, there are several popularly used methods that require rigorous theoretical justification and subsequent modification for reproducible statistical research in the domain of causal inference. This project is motivated by addressing these gaps and will be divided into two broad interconnected themes. In the first part, this project provides the first rigorous theoretical lens to the most popular method of confounder adjustment in large-scale genetic studies to find causal variants of diseases. This will in turn bring forth deeper questions about optimal statistical causal inference procedures that will be explored in the second part of the project. Since the project is designed to connect ideas from across statistical methods, probability theory, computer science, and machine learning, it will provide unique learning opportunities to design new courses and discourses. The project will therefore integrate research with education through course development, research mentoring for undergraduate and graduate students, especially those from underrepresented groups, and summer programs.
This project will focus on two broad and interrelated themes tied together by the motivation of conducting statistical and causal inference with modern observational data. The first part of the project provides the first detailed theoretical picture of the most popular principal component-based method of population stratification adjustment in genome-wide association studies. This part also aims to provide new methodologies to correct for known and previously unrecognized biases in the current methodology, as well as guidelines for practitioners on choosing between methods and designing studies. Recognizing the fundamental tenet of large-scale genetic data analysis as the identification of causal genetic determinants of disease phenotypes, the second part of the project develops the first complete picture of optimal statistical inference of causal effects in both high-dimensional models under sparsity and nonparametric models under smoothness conditions. Moreover, this part of the project responds to the fundamental question of tuning learning algorithms for estimating nuisance functions, such as the outcome regression and propensity score for causal effect estimation, to optimize the downstream mean-squared error of causal effect estimates instead of the prediction errors associated with these regression functions. The overall research will connect ideas from high-dimensional statistical inference, random matrix theory, higher-order semiparametric methods, and information theory.
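For readers unfamiliar with the population stratification adjustment discussed above, the following minimal sketch illustrates the standard principal component adjustment in a toy association test. It is purely illustrative and not drawn from the award; the data, dimensions, and variable names are hypothetical.

```python
# Illustrative sketch (not from the award): principal component adjustment
# for population stratification in a toy single-SNP association test.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 200, 2                                 # samples, SNPs, assumed number of PCs
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # toy genotype matrix (0/1/2 allele counts)
y = 0.5 * G[:, 0] + rng.normal(size=n)                # toy phenotype driven by SNP 0

# Top-k principal components of the column-centered genotype matrix
# serve as proxies for ancestry/population structure.
Gc = G - G.mean(axis=0)
U, S, Vt = np.linalg.svd(Gc, full_matrices=False)
PC = U[:, :k]

# Adjusted association test: regress the phenotype on the SNP of interest
# with the PCs included as covariates.
X = np.column_stack([np.ones(n), PC, Gc[:, 0]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("adjusted effect estimate for SNP 0:", beta[-1])
```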
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2338018","CAREER: Single-Fidelity vs. Multi-Fidelity Computer Experiments: Unveiling the Effectiveness of Multi-Fidelity Emulation","DMS","STATISTICS","06/01/2024","12/05/2023","Chih-Li Sung","MI","Michigan State University","Continuing Grant","Jun Zhu","05/31/2029","$79,437.00","","sungchih@msu.edu","426 AUDITORIUM RD RM 2","EAST LANSING","MI","488242600","5173555040","MPS","126900","1045","$0.00","Computer models have become indispensable tools across diverse fields, enabling the simulation of complex phenomena and facilitating decision-making without costly real-world experiments. Traditionally, computer models are simulated using single, high-accuracy simulations, employing a high level of detail and resolution throughout. Recent advancements, however, have shifted attention towards multi-fidelity simulations, balancing computational cost and accuracy by leveraging various levels of detail and resolution in the simulation. A key question arises: is it more effective to use single-fidelity or multi-fidelity simulations? This is a question practitioners often confront when conducting computer simulations. The research aims to address this fundamental question directly, providing valuable insights for practical decision-making. By leveraging insights gained from computational cost comparisons, the research will enhance the ability to predict complex scientific phenomena accurately and has the potential to revolutionize fields such as engineering, medical science, and biology. The project contributes to outreach and diversity efforts, inspiring youth and increasing female representation in STEM research. Moreover, collaborations with diverse research groups, as well as involvement in the REU exchange program, provide opportunities to engage undergraduate students, nurturing their interest in research and encouraging them to pursue careers in STEM. Research findings will be disseminated through publications and conferences. The code developed will be shared to foster collaboration and encourage others to build upon these innovative methodologies.
This research addresses the fundamental question of whether to conduct single-fidelity or multi-fidelity computer experiments by investigating the effectiveness of multi-fidelity simulations. It begins by comparing the computational costs of the two approaches, finding that multi-fidelity simulations can, under certain conditions, theoretically require more computational resources while achieving the same predictive ability. To mitigate the negative effects of low-fidelity simulations, a novel and flexible statistical emulator, called the Recursive Nonadditive (RNA) emulator, is proposed to leverage multi-fidelity simulations. A sequential design scheme based on this emulator is developed, which maximizes effectiveness by selecting inputs and fidelity levels according to a criterion that balances uncertainty reduction against computational cost. Furthermore, two novel multi-fidelity emulators, called ""secure emulators,"" are developed, which theoretically guarantee superior predictive performance compared to single-fidelity emulators, regardless of design choices.
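As background for the multi-fidelity emulation theme, the sketch below implements a classical autoregressive two-fidelity emulator in the style of Kennedy and O'Hagan. It is not the RNA emulator or the secure emulators proposed in the award; the simulators, kernel choices, and the fixed scaling constant rho are toy assumptions.

```python
# Background sketch only: a classical two-fidelity autoregressive emulator,
# NOT the Recursive Nonadditive (RNA) emulator proposed in the award.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f_low  = lambda x: np.sin(8 * x)                # cheap, biased simulator (toy)
f_high = lambda x: np.sin(8 * x) + 0.3 * x      # expensive simulator (toy)

X_lo = np.linspace(0, 1, 25)[:, None]           # many cheap runs
X_hi = np.linspace(0, 1, 6)[:, None]            # few expensive runs

# Stage 1: emulate the low-fidelity simulator from its abundant runs.
gp_lo = GaussianProcessRegressor(RBF(0.2)).fit(X_lo, f_low(X_lo).ravel())

# Stage 2: model the high-fidelity residual after scaling the low-fidelity
# prediction by a constant rho (fixed here for brevity; usually estimated).
rho = 1.0
resid = f_high(X_hi).ravel() - rho * gp_lo.predict(X_hi)
gp_delta = GaussianProcessRegressor(RBF(0.2)).fit(X_hi, resid)

# Multi-fidelity prediction combines both stages.
X_test = np.linspace(0, 1, 200)[:, None]
pred = rho * gp_lo.predict(X_test) + gp_delta.predict(X_test)
```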
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
-"2340241","CAREER: New Frameworks for Ethical Statistical Learning: Algorithmic Fairness and Privacy","DMS","STATISTICS","07/01/2024","01/23/2024","Linjun Zhang","NJ","Rutgers University New Brunswick","Continuing Grant","Yong Zeng","06/30/2029","$90,127.00","","linjun.zhang@rutgers.edu","3 RUTGERS PLZ","NEW BRUNSWICK","NJ","089018559","8489320150","MPS","126900","1045","$0.00","With the unprecedented impact of data science and machine learning in many aspects of our daily lives, such as healthcare, finance, education, and law, there is an urgent need to design ethical statistical learning algorithms that account for fairness and privacy. This project tackles the challenge of integrating ethical principles into the fabric of statistical learning. The approach prioritizes fairness by enhancing statistical algorithms to perform equitably, particularly in scenarios with limited sample sizes and where sensitive attributes are restricted by legal or societal norms. In parallel, this project addresses privacy by developing a general framework for studying the privacy-accuracy trade-off under new privacy constraints emerging with the advances in generative AI. The practical upshot of this work is the application of these methods to biomedical fields, accompanied by the release of open-source software, broadening the impact and encouraging ethical practices in statistical learning across various domains. This project promotes equitable and private data handling and provides research training opportunities to students.
The research objective of this project is to develop rigorous statistical frameworks for ethical machine learning, with a focus on algorithmic fairness and data privacy. More specifically, the project will: (1) develop innovative statistical methods that ensure fairness in a finite-sample and distribution-free manner; (2) design algorithms that ensure fairness while complying with societal and legal constraints on sensitive data; (3) establish new frameworks to elucidate the trade-off between statistical accuracy and new privacy concepts in generative AI, including machine unlearning and copyright protection. Taken together, the outcome of this research will build a firm foundation of ethical statistical learning and shed light on the development of new theoretical understanding and practical methodology with algorithmic fairness and privacy guarantees.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2339829","CAREER: Statistical foundations of particle tracking and trajectory inference","DMS","STATISTICS","04/01/2024","01/19/2024","Jonathan Niles-Weed","NY","New York University","Continuing Grant","Yong Zeng","03/31/2029","$89,991.00","","jdw453@nyu.edu","70 WASHINGTON SQ S","NEW YORK","NY","100121019","2129982121","MPS","126900","1045","$0.00","Many problems in human microbiology, astronomy, high-energy physics, fluid dynamics, and aeronautics involve large collections of moving ""particles"" with complicated dynamics. Learning how these systems work requires developing statistical procedures for estimating these dynamics on the basis of noisy observations. The goal of this research is to develop scalable, practical, and reliable methods for this task, with a particular focus on developing statistical theory for applications in cosmology, cellular biology, and machine learning. This research will also include a large outreach component based on broadening access to research opportunities for undergraduates and graduate students.
The technical goals of this proposal are to develop computationally efficient estimators for multiple particle tracking in d dimensions when the particles evolve according to a known or unknown stochastic process, to develop Bayesian methods for posterior sampling based on observed trajectories, and to extend these methods to obtain minimax estimation procedures for smooth paths in the Wasserstein space of probability measures. The research also aims to develop estimators for more challenging models that incorporate the growth and interaction of particles.
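To make the tracking task concrete, here is a minimal sketch of frame-to-frame particle linking by optimal assignment on squared displacements. This toy example only illustrates the problem setting; the award's estimators target far more general stochastic dynamics and Wasserstein-space trajectories.

```python
# Toy sketch: link particles between two frames by minimum total squared
# displacement (optimal assignment). Illustration only; positions are synthetic.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
frame0 = rng.uniform(size=(30, 2))                       # positions at time t
frame1 = frame0 + rng.normal(scale=0.01, size=(30, 2))   # noisy positions at t+1
rng.shuffle(frame1)                                      # observation order is unknown

cost = cdist(frame0, frame1, metric="sqeuclidean")       # pairwise movement costs
rows, cols = linear_sum_assignment(cost)                 # Hungarian-style matching
links = list(zip(rows, cols))                            # estimated particle links
print("total squared displacement:", cost[rows, cols].sum())
```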
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2338464","CAREER: Distribution-Free and Adaptive Statistical Inference","DMS","STATISTICS","01/15/2024","01/11/2024","Lihua Lei","CA","Stanford University","Continuing Grant","Yulia Gel","12/31/2028","$75,564.00","","lihualei@stanford.edu","450 JANE STANFORD WAY","STANFORD","CA","943052004","6507232300","MPS","126900","1045","$0.00","Recent years have witnessed a growing trend across scientific disciplines to embrace complex modeling and black-box machine learning algorithms. Despite the remarkable success of handling complex data structures and fitting sophisticated regression functions, there remains a substantial gap regarding the integration of rigorous statistical principles into these pipelines. The main difficulty revolves around achieving reliable uncertainty quantification and robust statistical inference without artificially simplifying the complexity inherent in these advanced tools. Most existing frameworks that aim to bridge the gap rely on strong assumptions under which the machine learning algorithm can accurately estimate the data generating distribution. Nevertheless, these assumptions are often hard to justify, especially for modern machine learning algorithms that have yet to be fully understood. This research project aims to develop new frameworks for statistical inference that wrap around any machine learning algorithms or complex models without concerning about failure modes. The resulting methods are able to address the potential threats to inferential validity caused by black-box machine learning algorithms in a wide range of applied fields, including medicine, healthcare, economics, political science, epidemiology, and climate sciences. Open source software will also be developed to help applied researchers integrate rigorous statistical inference into their domain-specific modeling workflows without compromising the effectiveness of modern tools in non-inferential tasks. This may further alleviate hesitation in adopting modern machine learning methods and catalyze collaboration between scientific and engineering fields. Throughout the project, the PI will mentor undergraduate and graduate students, equipping them with solid understandings of statistical principles to become future leaders in face of rapidly evolving machine learning techniques.
This proposal will focus on distribution-free inference, which is immune to misspecification of parametric models, violation of nonparametric assumptions like smoothness or shape constraints, and inaccuracy of asymptotic approximations due to limited sample size, high dimensionality, boundary cases, or irregularity. To avoid making uninformative decisions, an ideal distribution-free inference framework should also be adaptive to good modeling. This means that it should be as efficient as other frameworks that rely on distributional assumptions. Adaptivity alleviates the tradeoff between robustness and efficiency. The PI will develop distribution-free and adaptive inference frameworks for three specific problems. First, in causal inference, tighter identified sets can be obtained for partially identified causal effects by incorporating pre-treatment covariates. However, existing frameworks for sharp inference require estimating conditional distributions of potential outcomes given covariates. The PI will develop a generic framework based on duality theory that is able to wrap around any estimates of conditional distributions and make distribution-free and adaptive inference. Second, many target parameters in medicine, political economy, and causal inference can be formulated through extrema of the conditional expectation of an outcome given covariates. In contrast to classical methods that impose distributional assumptions to enable consistent estimation of the conditional expectation, the PI will develop a distribution-free framework for testing statistical null hypotheses and constructing valid confidence intervals on the extrema directly. Finally, the use of complex models and prediction algorithms in time series nowcasting and forecasting presents challenges for reliable uncertainty quantification. To address this, the PI will develop a framework based on model predictive control and conformal prediction that is able to wrap around any forecasting algorithm and calibrate it to achieve long-term coverage, without any assumptions on the distribution of the time series. The ultimate goal of this research is to bring insights and present a suite of tools to empower statistical reasoning with machine learning and augment machine learning with statistical reasoning.
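For intuition about long-term coverage calibration, the sketch below implements the simpler adaptive conformal inference update of Gibbs and Candes (2021), a relative of the control-based calibration described above rather than the PI's proposed framework. All data and the score buffer are synthetic assumptions.

```python
# Hedged sketch: online coverage calibration in the spirit of adaptive
# conformal inference; everything here (data, forecasts, scores) is synthetic.
import numpy as np

rng = np.random.default_rng(2)
T, alpha, gamma = 1000, 0.1, 0.02        # horizon, miscoverage target, step size
alpha_t = alpha
miss = []

for t in range(T):
    y = rng.normal()                     # observed outcome at time t
    forecast = 0.0                       # any black-box point forecast (toy)
    scores = np.abs(rng.normal(size=200))       # calibration-buffer scores (toy)
    width = np.quantile(scores, 1 - alpha_t)    # current interval half-width
    err = int(abs(y - forecast) > width)        # 1 if the interval missed
    # Update: a miss decreases alpha_t (widening future intervals),
    # a cover increases it (narrowing them), so coverage tracks 1 - alpha.
    alpha_t = min(max(alpha_t + gamma * (alpha - err), 1e-3), 1 - 1e-3)
    miss.append(err)

print("long-run miscoverage:", np.mean(miss))   # should hover near alpha = 0.1
```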
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."
"2337943","CAREER: New data integration approaches for efficient and robust meta-estimation, model fusion and transfer learning","DMS","STATISTICS","06/01/2024","01/30/2024","Emily Hector","NC","North Carolina State University","Continuing Grant","Yulia Gel","05/31/2029","$85,143.00","","ehector@ncsu.edu","2601 WOLF VILLAGE WAY","RALEIGH","NC","276950001","9195152444","MPS","126900","1045","$0.00","Statistical science aims to learn about natural phenomena by drawing generalizable conclusions from an aggregate of similar experimental observations. With the recent ?Big Data? and ?Open Science? revolutions, scientists have shifted their focus from aggregating individual observations to aggregating massive publicly available datasets. This endeavor is premised on the hope of improving the robustness and generalizability of findings by combining information from multiple datasets. For example, combining data on rare disease outcomes across the United States can paint a more reliable picture than basing conclusions only on a small number of cases in one hospital. Similarly, combining data on disease risk factors across the United States can distinguish local from national health trends. To date, statistical approaches to these data aggregation objectives have been limited to simple settings with limited practical utility. In response to this gap, this project develops new methods for aggregating information from multiple datasets in three distinct data integration problems grounded in scientific practice. The developed approaches are intuitive, principled and robust to substantial differences between datasets, and are broadly applicable in medical, economic and social sciences, among others. Among other applications, the project will deliver new tools to extract health insights from large electronic health records databases. The project will support undergraduate and graduate student training, course development, and the recruitment and professional mentoring of under-represented minorities in statistics. Further, the project will impact STEM education through a data science teacher training program in underserved communities.
This project develops intuitive, principled, robust and efficient methods in three essential data integration problems: meta-analysis, model fusion and transfer learning. First, the project delivers a set of meta-analysis methods for privacy-preserving one-shot estimation and inference using a new notion of dataset similarity. The primary novelty in the approach is the joint estimation of both dataset-specific parameters and a combined parameter that bears some similarity to the classic meta-estimator. Second, the project establishes model fusion methods that learn the clustering of similar datasets. The methods' unique feature is a model fusion that dials data integration along a spectrum from more to less fusion and thereby does not force model parameters from clustered datasets to be exactly equal. Third, the project develops flexible and robust transfer learning approaches that leverage historical information for improved statistical efficiency in a target dataset of interest. An important element of these approaches is a flexible specification of the type of models fit to the source datasets. All three sets of methods place a premium on interpretability, statistical efficiency and robustness of the inferential output. The project unifies the three sets of proposed methods under a formal data integration framework formulated around two axioms of data integration. Data integration ideas pervade every field of scientific study in which data are collected, and so the research contributes to scientific endeavors in the medical, economic and social sciences, among others.
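As a baseline for the meta-analysis theme, the following toy sketch shows classic one-shot meta-estimation by inverse-variance weighting of site-level summaries; the award's methods generalize this by jointly estimating dataset-specific and combined parameters. All numbers are synthetic.

```python
# Toy sketch of classic one-shot meta-analysis via inverse-variance weighting.
# Each site shares only a summary statistic (estimate, standard error).
import numpy as np

rng = np.random.default_rng(3)
K = 5                                           # number of datasets/sites
true_effect = 1.5
n_k = rng.integers(50, 500, size=K)             # per-site sample sizes

est = np.array([true_effect + rng.normal(scale=1 / np.sqrt(n)) for n in n_k])
se = 1 / np.sqrt(n_k)                           # per-site standard errors

w = 1 / se**2                                   # inverse-variance weights
combined = np.sum(w * est) / np.sum(w)          # classic meta-estimator
combined_se = np.sqrt(1 / np.sum(w))
print(f"combined estimate: {combined:.3f} +/- {1.96 * combined_se:.3f}")
```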
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria."