I am an Assistant Professor in the Mathematics Department at Harvey Mudd College. My research focuses on mathematical data science, optimization, and applied convex geometry. I leverage mathematical tools, such as those from probability, combinatorics, and convex geometry, on problems in data science and optimization. Areas in which I have been active recently include randomized numerical linear algebra, combinatorial methods for convex optimization, tensor decomposition for topic modeling, network consensus and ranking problems, and community detection on graphs and hypergraphs.
Before starting at HMC, I received my PhD from the Graduate Group in Applied Mathematics at the University of California, Davis, where I was fortunate to be advised by Professor Jesús A. De Loera. I then was a CAM Assistant Professor (post-doc) in the University of California, Los Angeles (UCLA) Mathematics Department, where my exceptional postdoctoral mentor was Professor Deanna Needell.
August ‘22: Our paper (with student Chen Yap and collaborator Ben Jarman) Paving the Way for Consensus: Convergence of Block Gossip Algorithms was accepted to IEEE Transactions on Information Theory! In this paper, we prove a new convergence bound for a broader class of randomized block Kaczmarz methods on a broader class of inconsistent linear systems, and then use this bound to prove convergence of block gossip methods for average consensus. We additionally specialize the result to three popular types of block gossip protocols, which use specific subgraph structures to iteratively update towards consensus.
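For readers new to gossip protocols, here is a toy sketch (not from the paper) of pairwise randomized gossip, the simplest special case of a block gossip protocol, with blocks of size two. The complete graph on 20 nodes and the iteration count are made-up choices for illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Pairwise randomized gossip: at each step a random edge is chosen and
# its two endpoint nodes replace their values with their average. The
# global mean is preserved, so the node values drift toward consensus.
# Assumption: a complete graph on 20 nodes, for fast mixing.
n = 20
edges = list(itertools.combinations(range(n), 2))
x = rng.standard_normal(n)   # initial node values
target = x.mean()            # the consensus value (preserved by averaging)

for _ in range(5000):
    i, j = edges[rng.integers(len(edges))]
    x[i] = x[j] = (x[i] + x[j]) / 2

print(np.max(np.abs(x - target)))  # every node ends up near the mean
```

Block gossip generalizes this by averaging over larger subgraphs (paths, cliques, spanning subtrees) at each step, which is where the block Kaczmarz analysis comes in.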
June ‘22: In 2022-2023, I am co-organizing the One World Mathematics of Information, Data, and Signals (MINDS) Seminar! Given the impossibility of travel during the COVID-19 crisis, the One World MINDS seminar was founded as an inter-institutional global online seminar aimed at giving researchers interested in mathematical data science, computational harmonic analysis, and related applications access to high-quality talks. Talks are held on Thursdays at either 2:30 pm New York time or 4:30 pm Shanghai time / 9:30 am (10:30 am in summer) Paris time.
June ‘22: Our minisymposium (with co-organizer Phil Chodrow) on “Tensor Methods for Network Data Science” was accepted to the SIAM Conference on the Mathematics of Data Science (MDS22), to be held in San Diego, CA in September 2022! We have four fabulous speakers, Izabel P. Aguiar (Stanford University), Tamara Kolda (MathSci.ai), “Bill” Feng Shi (TigerGraph), and Francesco Tudisco (GSSI), who will speak about exciting new developments in tensor-based methods for data science problems on networks.
June ‘22: This month I have the honor of speaking in the Harvey Mudd College Stauffer Lecture series where I will describe my work in “Tensor Models, Methods, and Medicine.” Additionally, I will speak (virtually) to the Rice University “Data Scientists in Training” Outreach Program where I will describe my path to research in mathematical data science!
May ‘22: Check out this amazing video my summer 2021 research student Hannah Kaufman made! In it, she illustrates how the Kaczmarz method works for solving linear systems, and presents an application to the problem of rating items according to pairwise comparison information. Way to go, Hannah!!
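For anyone curious after watching, here is a minimal sketch of the randomized Kaczmarz iteration the video illustrates: to solve a consistent system Ax = b, repeatedly pick a random equation and orthogonally project the current iterate onto the hyperplane of points satisfying that one equation. The problem sizes and iteration count below are illustrative choices, not from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomized Kaczmarz for a consistent linear system Ax = b.
m, n = 50, 10
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true               # consistent right-hand side

x = np.zeros(n)
for _ in range(2000):
    i = rng.integers(m)      # pick a random row (equation)
    a_i = A[i]
    # Orthogonal projection onto the hyperplane {x : a_i . x = b_i}
    x = x + (b[i] - a_i @ x) / (a_i @ a_i) * a_i

print(np.linalg.norm(x - x_true))  # error shrinks toward zero
```

In the ranking application, each pairwise comparison contributes one such equation relating the two items' scores.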
April ‘22: We (with collaborators Phil Chodrow and Nicole Eikmeier) submitted our paper Nonbacktracking spectral clustering of nonuniform hypergraphs! In this paper, we propose two methods for community detection on nonuniform hypergraphs (those containing edges of different sizes): a simple spectral approach using the nonbacktracking operator, and an alternating approach based on linearized belief propagation (the nonbacktracking operator appears here too!). We additionally prove theorems that reduce the computational complexity of working with the nonbacktracking operator and the other large matrices appearing in our methods.
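As a point of reference for the object at the heart of both methods, here is the classical nonbacktracking (Hashimoto) operator of an ordinary graph; the paper's contribution is, in part, generalizing this operator to nonuniform hypergraphs. The small example graph below is made up for illustration.

```python
import numpy as np

# Nonbacktracking (Hashimoto) matrix of a graph: rows and columns are
# indexed by *directed* edges, and B[(u, v), (p, w)] = 1 exactly when
# p == v and w != u, i.e. the walk continues from v without
# immediately backtracking to u.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]        # illustrative graph
directed = edges + [(v, u) for u, v in edges]   # both orientations
index = {e: k for k, e in enumerate(directed)}

B = np.zeros((len(directed), len(directed)))
for (u, v) in directed:
    for (p, w) in directed:
        if p == v and w != u:
            B[index[(u, v)], index[(p, w)]] = 1

print(B.shape)  # (8, 8): one row/column per directed edge
```

Since B is indexed by directed edges, it can be much larger than the adjacency matrix, which is why complexity-reducing results matter in practice.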
April ‘22: Our paper (with student Edwin Chau) On Application of Block Kaczmarz Methods in Matrix Factorization was accepted to SIAM Undergraduate Research Online (SIURO)! In this work, we discuss and test a block Kaczmarz solver that replaces the least-squares subroutine in the common alternating scheme for matrix factorization. This variant trades a small increase in factorization error for significantly faster algorithmic performance. In doing so, we find block sizes that produce a solution comparable to that of the least-squares solver in only a fraction of the runtime and with a fraction of the working memory requirement!
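To give a flavor of the idea, here is a hypothetical, simplified sketch (not the paper's exact algorithm, block sizes, or stopping rules): in the alternating scheme for a factorization X ≈ UV, each least-squares subproblem is replaced by a few randomized block Kaczmarz projection steps. All dimensions and parameters below are made-up illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_kaczmarz_steps(A, B, Z, n_steps=5, block=8):
    """Run a few block Kaczmarz steps toward solving A @ Z = B for Z."""
    m = A.shape[0]
    for _ in range(n_steps):
        S = rng.choice(m, size=min(block, m), replace=False)
        # Project Z using the least-squares solution of the sampled row block.
        Z = Z + np.linalg.pinv(A[S]) @ (B[S] - A[S] @ Z)
    return Z

m, n, r = 60, 40, 5
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # exactly rank r
U = rng.standard_normal((m, r))
V = rng.standard_normal((r, n))

# Alternating updates: each factor update uses block Kaczmarz steps
# instead of a full least-squares solve.
for _ in range(30):
    V = block_kaczmarz_steps(U, X, V)           # update V with U fixed
    U = block_kaczmarz_steps(V.T, X.T, U.T).T   # update U with V fixed

print(np.linalg.norm(X - U @ V) / np.linalg.norm(X))  # small relative error
```

Each block step only touches a small row block of the data, which is the source of the runtime and working-memory savings the paper quantifies.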