I work on algorithmic statistics, with a focus on unsupervised and minimally supervised learning.
S. Dasgupta. A cost function for similarity-based hierarchical clustering.
Forty-Eighth ACM Symposium on Theory of Computing (STOC), 2016.
S. Dasgupta and K. Sinha. Randomized partition trees for nearest neighbor search.
Algorithmica, 72(1): 237-263, 2015.
K. Chaudhuri and S. Dasgupta. Rates of convergence for nearest neighbor classification.
Neural Information Processing Systems (NIPS), 2014.
S. Dasgupta. Two faces of active learning.
Theoretical Computer Science, 412(19): 1767-1781, 2011.
K. Chaudhuri and S. Dasgupta. Rates of convergence for the cluster tree.
Neural Information Processing Systems (NIPS), 2010.
S. Dasgupta and Y. Freund. Random projection trees and low dimensional manifolds.
Fortieth ACM Symposium on Theory of Computing (STOC), 2008.
S. Dasgupta. Coarse sample complexity bounds for active learning.
Neural Information Processing Systems (NIPS), 2005.
S. Dasgupta and P.M. Long. Performance guarantees for hierarchical clustering.
Journal of Computer and System Sciences, 70(4): 555-569, 2005.
S. Dasgupta and A. Gupta. An elementary proof of a theorem of Johnson and Lindenstrauss.
Random Structures and Algorithms, 22(1): 60-65, 2003.
S. Dasgupta and L.J. Schulman. A two-round variant of EM for Gaussian mixtures.
Sixteenth Conference on Uncertainty in Artificial Intelligence (UAI), 2000.
S. Dasgupta. Learning mixtures of Gaussians.
Fortieth Annual IEEE Symposium on Foundations of Computer Science (FOCS), 1999.