I am currently a Pre-Doctoral Researcher at Google DeepMind, India. Prior to this, I was an undergrad at the Indian Institute of Technology Kanpur (IITK), double majoring in Electrical Engineering and Mathematics. Towards the tail end of my time at IITK, I interned at the Max Planck Institute for Intelligent Systems (MPI-IS) under the supervision of Michael Muehlebach and Bernhard Schölkopf, and at the Tata Institute of Fundamental Research under the supervision of Sandeep Juneja.
I’m broadly interested in the union of Probability, Geometry and Theoretical Computer Science. Specific areas of interest include:
Sampling and Markov Chains: Algorithms and Lower Bounds for Sampling from High-Dimensional Gibbs Measures and Convex Bodies, Spin Systems, Bakry-Émery Theory
Applied Probability: Phase Transitions in Random Structures, Information-Computation Tradeoffs in High-Dimensional Statistics, Optimal Transport Theory, Interacting Particle Systems
Dynamical Systems and Optimization: Euclidean and Wasserstein Gradient Flows, applications to Continuous Optimization and Statistical Learning
Much of my recent work has focused on the applications of these ideas to the analysis of widely used Machine Learning algorithms.
Provably Fast Finite-Particle Variants of SVGD via Virtual Particle Stochastic Approximation
with Dheeraj Nagaraj
Spotlight at NeurIPS 2023 [Paper]
Oral Presentation at OTML Workshop, NeurIPS 2023
Utilising the CLT Structure in Stochastic Gradient based Sampling: Improved Analysis and Faster Algorithms
with Dheeraj Nagaraj and Anant Raj
COLT 2023 [Paper]
Near Optimal Heteroscedastic Regression with Symbiotic Learning
with Dheeraj Baby, Dheeraj Nagaraj and Praneeth Netrapalli
COLT 2023 [Paper]
Sampling without Replacement Leads to Faster Rates in Finite-Sum Minimax Optimization
with Bernhard Schölkopf and Michael Muehlebach
NeurIPS 2022 [Paper]