About Me

[Image: visualization of the residual of an iterative projection method for linear inequalities]

I am an Assistant Professor in the Mathematics Department at Harvey Mudd College. My research focuses on mathematical data science, optimization, and applied convex geometry. I apply mathematical tools, such as those from probability, combinatorics, and convex geometry, to problems in data science and optimization. Areas in which I have been active recently include randomized numerical linear algebra, combinatorial methods for convex optimization, tensor decomposition for topic modeling, network consensus and ranking problems, and community detection on graphs and hypergraphs. My research is supported by NSF DMS #2211318, “Tensor Models, Methods, and Medicine.”

Before starting at HMC, I received my PhD from the Graduate Group in Applied Mathematics at the University of California, Davis, where I was fortunate to be advised by Professor Jesús A. De Loera. I then was a CAM Assistant Professor (postdoc) in the Mathematics Department at the University of California, Los Angeles (UCLA), where my exceptional postdoctoral mentor was Professor Deanna Needell.


Recent News

August ‘23: I have a few talks and trips this fall! I will be speaking in the One World Mathematics of INformation, Data, and Signals (1W-MINDS) Seminar on September 7th at 11:30 am PT on Zoom; my talk is titled “Randomized Kaczmarz Methods: Corruption, Consensus, and Concentration.” Later in the month, I will speak at the Fifty-Ninth Annual Allerton Conference on Communication, Control, and Computing in Monticello, IL, on our paper (with collaborators Anna Ma and Liza Rebrova) “On Subsampled Quantile Randomized Kaczmarz.” After that, I go to Atlanta, GA, for the AWM Research Symposium, where I will speak in the session “Tensor Methods for data modeling” on Saturday, September 30; that talk is titled “Hierarchical nonnegative tensor factorizations and applications.” The last leg of the trip is to Boston, MA, where I will visit Wellesley College to give a talk on Monday, October 2.

August ‘23: Our paper (with collaborators Anna Ma and Liza Rebrova) “On Subsampled Quantile Randomized Kaczmarz” was accepted to the Fifty-Ninth Annual Allerton Conference on Communication, Control, and Computing, to be held in Monticello, Illinois, September 26–29, 2023. In this paper, we provide theoretical guarantees for the Quantile Randomized Kaczmarz method when only a subsample of the linear system residual is used in each iteration; previous guarantees required computing the entire residual, which is often (or nearly always) infeasible. We’re excited to participate in this wonderful conference!
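For readers curious about the flavor of the method, here is a minimal Python sketch of one way a subsampled quantile randomized Kaczmarz iteration can be organized. This is an illustrative sketch, not the exact algorithm from our paper: the parameter names (q for the quantile level, t for the subsample size), the uniform row sampling, and the accept/skip rule are my own simplifying assumptions.

    import numpy as np

    def subsampled_qrk(A, b, q=0.7, t=50, iters=1000, seed=None):
        # Illustrative sketch (not the paper's exact algorithm): at each
        # step, estimate the q-quantile of the absolute residual from a
        # random subsample of t rows, then apply the Kaczmarz projection
        # for a sampled row only if its residual falls below that
        # estimate, so equations suspected of corruption are skipped.
        rng = np.random.default_rng(seed)
        m, n = A.shape
        x = np.zeros(n)
        for _ in range(iters):
            # Estimate the residual quantile from a subsample, avoiding
            # the full residual computation A @ x - b in each iteration.
            sample = rng.choice(m, size=min(t, m), replace=False)
            threshold = np.quantile(np.abs(A[sample] @ x - b[sample]), q)
            # Sample a candidate row uniformly and accept the update
            # only if its residual is within the estimated quantile.
            i = rng.integers(m)
            r_i = A[i] @ x - b[i]
            if np.abs(r_i) <= threshold:
                x -= (r_i / np.dot(A[i], A[i])) * A[i]  # Kaczmarz projection
        return x

The intuition: rows with unusually large residuals are likely corrupted, so the quantile threshold filters them out, and subsampling keeps the per-iteration cost low.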

June ‘23: I’ve published all course materials for my course “Mathematical Data Science & Topic Modeling”! All slides, code, and assignments are available and free to be used, edited, and shared. If you make use of these materials, please cite the repository and consider letting me know your impressions and any typos or errors you catch!

May ‘23: I’m serving on the organizing committee for the third SIAM Conference on the Mathematics of Data Science (SIAM MDS) in 2024! SIAM MDS is a biennial conference of the SIAM Activity Group on Data Science and “aims to bring together those who are building foundations for data science and its applications across science, engineering, technology, and society.” The organizing committee shapes and defines the scientific program of the conference, e.g., identifying invited speakers and choosing mini-tutorials. I’m very excited to help plan this amazing conference!

April ‘23: I was featured in the April 2023 Academic Data Science Alliance (ADSA) Career Development Network Round-up (newsletter) and blog!

March ‘23: We (with students Tyler Will, Joshua Vendrow, Runyu Zhang, Mengdi Gao, and Eli Sadovnik, and colleagues Denali Molitor and Deanna Needell) submitted our paper “Neural Nonnegative Matrix Factorization for Hierarchical Multilayer Topic Modeling”! In this paper, we introduce a new model based on nonnegative matrix factorization (NMF) for detecting latent hierarchical structure in data, which we call Neural NMF. This model frames hierarchical NMF as a neural network, and we provide theoretical results that allow us to train Neural NMF via a natural backpropagation method. We illustrate the promise of this model with several numerical experiments on benchmark datasets and real-world data.
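To make the hierarchical structure concrete, here is a minimal Python sketch of the forward pass of a hierarchical NMF model in the spirit of Neural NMF. The function name and the column-wise nonnegative least-squares solve are my own illustrative choices; the backpropagation-based training of the dictionaries, which is the heart of Neural NMF, is not shown.

    import numpy as np
    from scipy.optimize import nnls

    def hierarchical_nmf_forward(X, factors):
        # Sketch of the forward pass of a hierarchical NMF "network".
        # Given nonnegative dictionaries A_0, ..., A_L (the entries of
        # `factors`), each layer solves the nonnegative least-squares
        # problem S_l = argmin_{S >= 0} ||S_{l-1} - A_l S||_F (with
        # S_{-1} = X), so the topics at each layer are expressed in
        # terms of the coarser topics of the next layer.
        S = X
        codes = []
        for A in factors:
            # Column-wise nonnegative least squares: nnls solves
            # min ||A s - y|| subject to s >= 0 for each column y of S.
            S = np.column_stack([nnls(A, S[:, j])[0] for j in range(S.shape[1])])
            codes.append(S)
        return codes

For example, with a word-document matrix X of shape (words, documents) and factors = [A0, A1] of shapes (words, 10) and (10, 3), the returned codes express each document first in 10 fine-grained topics and then in 3 coarse supertopics.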