I am an Assistant Professor in the Mathematics Department at Harvey Mudd College. My research focuses on mathematical data science, optimization, and applied convex geometry: I apply tools from probability, combinatorics, and convex geometry to problems in data science and optimization. Areas in which I have been active recently include randomized numerical linear algebra, combinatorial methods for convex optimization, tensor decomposition for topic modeling, network consensus and ranking problems, and community detection on graphs and hypergraphs.
Before starting at HMC, I received my PhD from the Graduate Group in Applied Mathematics at the University of California, Davis, where I was fortunate to be advised by Professor Jesús A. De Loera. I then was a CAM Assistant Professor (postdoc) in the University of California, Los Angeles (UCLA) Mathematics Department, where my exceptional postdoctoral mentor was Professor Deanna Needell.
May ‘22: Check out this amazing video my summer 2021 research student Hannah Kaufman made! In it, she illustrates how the Kaczmarz method works for solving linear systems, and presents an application to the problem of rating items according to pairwise comparison information. Way to go, Hannah!!
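For readers unfamiliar with the method: the randomized Kaczmarz iteration solves a linear system Ax = b by repeatedly projecting the current iterate onto the hyperplane defined by a single randomly chosen equation. A minimal NumPy sketch of the idea (not Hannah's implementation):

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=500, seed=0):
    """Solve a consistent system A x = b by randomized Kaczmarz:
    repeatedly project the iterate onto the hyperplane a_i^T x = b_i
    of one row i, sampled with probability proportional to ||a_i||^2."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    probs = np.linalg.norm(A, axis=1) ** 2
    probs /= probs.sum()
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        x += (b[i] - a @ x) / (a @ a) * a  # orthogonal projection onto equation i
    return x
```

For a consistent system, the iterates converge linearly in expectation to a solution; each step touches only one row of A, which is what makes the method attractive for very large systems.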
April ‘22: We (with collaborators Phil Chodrow and Nicole Eikmeier) submitted our paper Nonbacktracking spectral clustering of nonuniform hypergraphs! In this paper, we propose methods for community detection on nonuniform (containing edges of different sizes) hypergraphs: one is a simple spectral approach using the nonbacktracking operator, and the other is an alternating approach based upon linearized belief propagation (the nonbacktracking operator appears here too!). We additionally provide theorems that improve the computational complexity of working with the nonbacktracking operator and other large matrices appearing in our methods.
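For context, on an ordinary graph the nonbacktracking operator is indexed by directed edges: the entry for the pair (u→v), (v→w) is 1 exactly when w ≠ u, so it tracks walks that never immediately retrace a step. A minimal construction sketch for the graph case (the hypergraph operator in the paper generalizes this):

```python
import numpy as np

def nonbacktracking_matrix(edges):
    """Build the nonbacktracking matrix of an undirected graph given as a
    list of edges (u, v). Rows/columns are indexed by directed edges;
    entry ((u, v), (v, w)) is 1 exactly when w != u (no backtracking)."""
    directed = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    index = {e: k for k, e in enumerate(directed)}
    B = np.zeros((len(directed), len(directed)))
    for (u, v) in directed:
        for (v2, w) in directed:
            if v2 == v and w != u:  # walk continues from v without returning to u
                B[index[(u, v)], index[(v2, w)]] = 1
    return B
```

On a triangle, for example, every directed edge has exactly one nonbacktracking continuation, so each row of B has a single 1.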
April ‘22: Our (with student Edwin Chau) paper On Application of Block Kaczmarz Methods in Matrix Factorization was accepted to SIAM Undergraduate Research Online (SIURO)! In this work, we discuss and test a block Kaczmarz solver that replaces the least-squares subroutine in the common alternating scheme for matrix factorization. This variant trades a small increase in factorization error for significantly faster algorithmic performance. In doing so, we find block sizes that produce a solution comparable to that of the least-squares solver for only a fraction of the runtime and working memory requirement!
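To illustrate the general idea (a simplified sketch, not the paper's exact algorithm or parameter choices): in an alternating scheme for M ≈ WH, each least-squares subproblem can be replaced by a randomized block Kaczmarz update that touches only a block of rows at a time.

```python
import numpy as np

def block_kaczmarz_step(A, B, X, rng, block=10):
    """One randomized block Kaczmarz update toward A X = B: project X
    onto the solution set of a random block of rows of A."""
    idx = rng.choice(A.shape[0], size=min(block, A.shape[0]), replace=False)
    A_blk, B_blk = A[idx], B[idx]
    return X + np.linalg.pinv(A_blk) @ (B_blk - A_blk @ X)

def alternating_factorize(M, r=5, sweeps=50, seed=0):
    """Alternating factorization M ~ W H, with each least-squares solve
    replaced by a block Kaczmarz update (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.standard_normal((m, r))
    H = rng.standard_normal((r, n))
    for _ in range(sweeps):
        H = block_kaczmarz_step(W, M, H, rng)           # update H with W fixed
        W = block_kaczmarz_step(H.T, M.T, W.T, rng).T   # update W with H fixed
    return W, H
```

Each update costs a pseudoinverse of a small block rather than a full least-squares solve over all rows, which is the source of the runtime and memory savings described above.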
March ‘22: I am participating in the IAS Women and Mathematics program “The Mathematics of Machine Learning” at the Institute for Advanced Study in Princeton, NJ! This program will be held May 21-27, 2022 and will feature lectures by Cynthia Rudin (Duke University) and Maria-Florina Balcan (Carnegie Mellon University). Looking forward to meeting an exceptional group of faculty and students and learning about some great new topics!
March ‘22: I am participating in the AMS MRC program “Models and Methods for Sparse (Hyper)Network Science” as an assistant to the organizers! This program will be held June 5-11, 2022 at Beaver Hollow Conference Center in Java Center, NY and will deal with graph and hypergraph models and their applications to the real-world study of critical systems. Looking forward to the opportunity to learn and network with this exceptional community.
March ‘22: I am co-organizing the Southern California Applied Mathematics Symposium (SOCAMS) with Heather Zinn-Brooks (Harvey Mudd College), Christina Edholm (Scripps College), Manuchehr Aminian (Cal Poly Pomona), Phil Chodrow (UCLA), Anna Ma (UCI), Adam MacLean (USC), Chris Miles (UCI), and Alona Kryshchenko (CSU Channel Islands). This one-day meeting will be held on the campus of Harvey Mudd College on May 21, 2022! This conference aims to bring together researchers from universities throughout Southern California, working in all areas of Applied Mathematics, for a one-day exchange of ideas in an informal and collaborative atmosphere. More information and registration are available at https://www.socams.org!
February ‘22: Elizaveta Rebrova (Princeton Univ. ORFE) and I are organizing sessions titled “Randomized Iterative Methods beyond Least-squares” and “Tensor Modeling and Optimization” for the “Optimization for Data Science and Machine Learning” cluster at the seventh International Conference on Continuous Optimization (ICCOPT), which will take place at Lehigh University in Bethlehem, Pennsylvania during July 25-28, 2022. We have organized two great slates of speakers on these topics!
February ‘22: Applications are now open for my funded summer undergraduate research projects Tensor Models and Methods for Medical Imaging, Numerical Linear Algebraic Analyses of Opinion Dynamics on Networks, and Iterative Methods for Large-scale Systems of Linear Equations! Applications may be submitted via the HMC URO portal until February 20.
January ‘22: Our (with student Josh Vendrow) paper A Generalized Hierarchical Nonnegative Tensor Decomposition was accepted to the 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)! In this paper, we propose a hierarchical tensor decomposition model that generalizes a natural model for matrices, a property which many hierarchical tensor decomposition models lack. This model naturally illuminates the hierarchy of latent topics in tensor-structured data.
January ‘22: I am co-organizing the MAA Session “Establishing Interdisciplinary Collaborations in Teaching and Research” at the Joint Mathematics Meetings (virtual), April 6-9, 2022, with Jessica Oehrlein (Fitchburg State University)! Due to the transition to virtual format and the challenges of scheduling, this session will occur in March. We have a great set of speakers who will lead an interactive session on how to begin and sustain interdisciplinary collaborations with academics outside mathematics and with industrial colleagues.
December ‘21: Our paper Quantile-based Iterative Methods for Corrupted Systems of Linear Equations was accepted for publication in SIAM Journal on Matrix Analysis and Applications (SIMAX)! In this paper, we propose iterative methods for solving large-scale and arbitrarily corrupted systems of equations. We provide both theoretical and empirical evidence of the promise of these methods; our theoretical results build upon new and classical results in high-dimensional probability.
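The flavor of these methods can be seen in a toy sketch (not the paper's exact algorithm or parameter choices): a Kaczmarz step is accepted only when the chosen equation's residual falls below a sample quantile of residuals, so that a small number of grossly corrupted equations are effectively ignored once the iterate is reasonably accurate.

```python
import numpy as np

def quantile_kaczmarz(A, b, q=0.7, iters=2000, sample=50, seed=0):
    """Toy quantile-based Kaczmarz sketch: project onto a randomly chosen
    equation only if its residual is at most the q-quantile of a random
    sample of residuals, skipping equations that look corrupted."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.integers(m)
        r_i = abs(A[i] @ x - b[i])
        idx = rng.choice(m, size=min(sample, m), replace=False)
        thresh = np.quantile(np.abs(A[idx] @ x - b[idx]), q)
        if r_i <= thresh:  # accept only equations with typical residuals
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a
    return x
```

The quantile threshold shrinks as the iterate converges, so heavily corrupted equations (whose residuals stay large) are eventually rejected at every step.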
November ‘21: We (with student Chen Yap) submitted our paper Paving the Way for Consensus: Convergence of Block Gossip Algorithms! In this paper, we prove a new convergence bound for a broader class of randomized block Kaczmarz methods on a broader class of inconsistent linear systems, then utilize this bound to prove convergence of the block gossip methods for average consensus. We additionally specialize the result to three popular types of block gossip protocols, which utilize specific subgraph structures to iteratively update towards consensus.
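As a toy illustration of the gossip setting (a sketch of the general idea, not the paper's protocols): in pairwise gossip, a randomly chosen pair of connected nodes replaces both values by their average; block gossip does the same over larger node subsets such as paths or cliques, and all values converge to the global average.

```python
import numpy as np

def block_gossip_average(x, blocks, iters=2000, seed=0):
    """Toy block gossip for average consensus: at each step a random
    block (subset of nodes, e.g. an edge or a clique) replaces all of
    its values by the block average. The global mean is preserved and,
    provided the blocks connect the network, values converge to it."""
    rng = np.random.default_rng(seed)
    x = np.array(x, dtype=float)
    for _ in range(iters):
        blk = blocks[rng.integers(len(blocks))]
        x[blk] = x[blk].mean()
    return x
```

Pairwise gossip is the special case where every block is a single edge; using larger subgraph blocks mixes information faster per iteration at a higher per-step cost.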