I am a first-year PhD student at Stanford CS. Prior to this, I was a Pre-Doctoral Researcher at Google DeepMind. Before that, I was a dazed and confused undergrad at the Indian Institute of Technology Kanpur, where I double-majored in Electrical Engineering and Mathematics.
I’m broadly interested in the design and analysis of algorithms, applying ideas from probability and geometry. Specific interests include the analysis of Markov chains, the complexity of statistical inference, continuous optimization, and Wasserstein gradient flows.
Near-Optimal Streaming Heavy-Tailed Statistical Estimation with Clipped SGD
with Dheeraj Nagaraj, Soumyabrata Pal, Arun Suggala and Prateek Varshney
NeurIPS 2024 [Paper]
Provably Fast Finite-Particle Variants of SVGD via Virtual Particle Stochastic Approximation
with Dheeraj Nagaraj
Spotlight at NeurIPS 2023 [Paper]
Oral Presentation at OTML Workshop, NeurIPS 2023
Utilising the CLT Structure in Stochastic Gradient based Sampling: Improved Analysis and Faster Algorithms
with Dheeraj Nagaraj and Anant Raj
COLT 2023 [Paper]
Near Optimal Heteroscedastic Regression with Symbiotic Learning
with Dheeraj Baby, Dheeraj Nagaraj and Praneeth Netrapalli
COLT 2023 [Paper]
Sampling without Replacement Leads to Faster Rates in Finite-Sum Minimax Optimization
with Bernhard Schölkopf and Michael Muehlebach
NeurIPS 2022 [Paper]