University of California, San Diego
9500 Gilman Drive, La Jolla, CA 92092
Research Overview
Welcome! I am a PhD student at UC San Diego, working in machine learning. I am advised by Yoav Freund, and received my M.S. in Computer Science from UCSD in spring 2013. My current research interests include developing semi-supervised algorithms to combine ensembles of predictors, and studying sequential processes and learning.
I am on the job market. My research statement and CV have more information.
Research Manuscripts

Muffled Semi-Supervised Learning. [arXiv]
Unlabeled data can yield significant off-the-shelf improvements in supervised classification performance: not by imputing labels on the unlabeled data that agree with the supervised predictions, but by "muffling" those predictions, imputing the opposite labels instead.
Preprint. 
Learning to Abstain from Binary Prediction. [arXiv]
The problem of binary classification with an abstaining predictor centers on the tradeoff between abstaining and making a prediction error; we characterize this tradeoff optimally, both theoretically and with efficient algorithms that use labeled and unlabeled data.
Preprint. 
Sequential Nonparametric Testing with the Law of the Iterated Logarithm. [arXiv]
For nonparametric testing of the difference in means between two samples (and many other problems besides), we devise rigorous sequential tests in a classical Wald-style framework that stop as soon as possible, adapting to the unknown mean difference.
Conference on Uncertainty in Artificial Intelligence (UAI), 2016. 
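The flavor of such a test can be sketched generically. This is only a schematic with illustrative constants, not the paper's calibrated boundary: the running sum of bounded observations is compared at every step against a threshold that grows like sqrt(t log log t), so a nonzero mean (linear growth) eventually crosses it while a zero-mean walk stays below it.

```python
import math
import random

def sequential_mean_test(stream, delta=0.05, max_steps=100000):
    """Schematic sequential test for whether a stream of bounded
    observations has zero mean. Stops and rejects the null as soon as
    the running sum crosses an iterated-logarithm-style boundary; the
    constants here are illustrative, not the paper's calibrated ones."""
    s = 0.0  # running sum of observations
    t = 0
    for t, x in enumerate(stream, start=1):
        s += x
        # LIL-style boundary: grows like sqrt(t log log t), so a truly
        # nonzero mean (linear growth of s) eventually crosses it.
        boundary = math.sqrt(
            3.0 * t * (math.log(math.log(max(t, 3))) + math.log(2.0 / delta))
        )
        if abs(s) > boundary:
            return ("reject", t)
        if t >= max_steps:
            break
    return ("fail to reject", t)

random.seed(0)
biased = (random.uniform(-1, 1) + 0.3 for _ in iter(int, 1))  # mean 0.3
print(sequential_mean_test(biased))  # rejects after a few hundred steps
```

The test adapts to the unknown mean: a larger bias crosses the boundary sooner, while the boundary's slow growth keeps the false-positive probability controlled uniformly over time.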
Instance-Dependent Regret Bounds for Dueling Bandits.
Online learning from limited (bandit) pairwise feedback between actions is easy when a few actions are better than the rest and the matrix of pairwise preferences is well-conditioned.
Conference on Learning Theory (COLT), 2016. 
Optimal Binary Classifier Aggregation for General Losses. [arXiv]
The minimax optimal way to combine a set of binary classifiers of varying competences with unlabeled data is an artificial neuron with a sigmoid-shaped transfer function that depends only on the evaluation loss function, for any convex and many non-convex losses.
Short version in Workshop on Learning Faster from Easy Data, NIPS, 2015. 
Scalable Semi-Supervised Aggregation of Classifiers. [arXiv]
There is an efficient way to use unlabeled data to combine the trees of a random forest, which often performs better than random forests for binary classification.
Neural Information Processing Systems (NIPS), 2015.

Optimally Combining Classifiers Using Unlabeled Data. [arXiv]
The minimax optimal way to combine a set of binary classifiers of known competences with unlabeled data resembles a weighted majority vote, and is efficiently learnable.
Conference on Learning Theory (COLT), 2015.
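Schematically, the resulting aggregation rule looks like a weighted vote over the ensemble. Here is a minimal sketch with hand-picked weights; in the paper, the weights come from a minimax optimization using unlabeled data rather than being set by hand:

```python
def weighted_majority(predictions, weights):
    """Combine binary (+1/-1) ensemble predictions by a weighted vote.

    predictions: list of per-classifier vote lists, one vote per example.
    weights: per-classifier weights; hand-set here for illustration --
    in the paper they are learned from unlabeled data.
    """
    n_examples = len(predictions[0])
    combined = []
    for j in range(n_examples):
        # Weighted sum of the classifiers' votes on example j.
        score = sum(w * votes[j] for w, votes in zip(weights, predictions))
        combined.append(1 if score >= 0 else -1)
    return combined

# Three classifiers vote on four examples.
preds = [[+1, +1, -1, -1],
         [+1, -1, -1, +1],
         [-1, +1, -1, -1]]
print(weighted_majority(preds, [0.5, 0.3, 0.2]))  # [1, 1, -1, -1]
```

The first two examples follow the heavily weighted first classifier even when the others disagree, which is the sense in which higher known competence translates into a larger say in the vote.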

PAC-Bayes Iterated Logarithm Bounds for Martingale Mixtures. [arXiv]
Any mixture of random walks, taken with respect to a known prior distribution, stays with high probability within an optimally characterized range of its mean, at all times along its sample path and uniformly over all posterior distributions.

Sharp Finite-Time Iterated-Logarithm Martingale Concentration. [arXiv]
Any random walk with high probability stays within a narrow, optimally characterized range of its mean, at all times along its sample path.
Submitted to The Annals of Probability, 2015.
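The shape of such a bound can be written schematically, with constants suppressed; here $C$ and $t_0$ are unspecified placeholders, whereas the paper pins the constants down sharply:

```latex
% Classical asymptotic LIL for a random walk $S_t$ with unit-variance
% increments:
\limsup_{t \to \infty} \frac{|S_t|}{\sqrt{2 t \log \log t}} = 1
\quad \text{a.s.}

% A finite-time analogue: with probability at least $1 - \delta$,
% simultaneously for all $t \ge t_0$,
|S_t| \le C \sqrt{t \left( \log \log t + \log \tfrac{1}{\delta} \right)}.
```

The point of the finite-time version is that the guarantee holds at every time $t$ simultaneously, not just in the limit, which is what makes it usable inside sequential algorithms.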

The Fast Convergence of Incremental PCA. [arXiv]
Natural algorithms for incremental principal component analysis (PCA), running in linear time and space, converge quickly to the optimum despite the problem's non-convexity.
Neural Information Processing Systems (NIPS), 2013.
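The canonical such algorithm is Oja's rule. Here is a minimal sketch for estimating the top principal direction, with an illustrative step-size schedule rather than the one analyzed in the paper: each data row nudges the current direction estimate toward itself in proportion to their correlation, followed by renormalization.

```python
import random

def oja_top_component(rows, dim, passes=20, lr=0.1):
    """Oja's rule: incremental estimate of the top principal direction.

    Processes one data row at a time in O(dim) memory. The decaying
    step-size schedule here is a simple illustrative choice.
    """
    random.seed(0)  # fixed seed for a reproducible random start
    w = [random.gauss(0, 1) for _ in range(dim)]
    t = 0
    for _ in range(passes):
        for x in rows:
            t += 1
            eta = lr / (1 + 0.01 * t)  # decaying step size
            xw = sum(xi * wi for xi, wi in zip(x, w))
            w = [wi + eta * xw * xi for wi, xi in zip(w, x)]
            norm = sum(wi * wi for wi in w) ** 0.5
            w = [wi / norm for wi in w]  # project back to the unit sphere
    return w

# Data stretched along the first coordinate axis.
data = [[3.0, 0.1], [-2.9, -0.2], [3.1, 0.0], [-3.0, 0.15]]
w = oja_top_component(data, dim=2)
print(w)  # aligns with the first coordinate axis, up to sign
```

Each update touches only the current row, so the memory footprint stays linear in the dimension; the interesting part, and the subject of the paper, is why this non-convex iteration nonetheless converges quickly.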

An Empirical Comparison of Sparse vs. Embedding Techniques on Many-Class Text Classification.
Workshop on Extreme Classification, NIPS, 2013. 
The Utility of Abstaining in Binary Classification.
Research Exam (requirement for M.S.), UC San Diego. March 2013.
Biography
Before the PhD, I was an Associate at Strand Life Sciences, where I worked in statistical genomics, developing tools for researchers in the field. Before that, I received a B.S. (High Honors) in Electrical Engineering and Computer Science from UC Berkeley in December 2008, minoring in (quantum) physics along the way. Earlier still, I lived in various parts of India, the US, and Singapore.
Miscellaneous
I used to play the violin (and occasionally still do); before college, I completed a certification in it (unfortunately, the recordings are lost!). I also played in the Carnatic classical style, which is less polyphonic but melodically far richer.
I have always enjoyed traveling and do so whenever the opportunity arises. In my free time, I sometimes write about tidbits of history and philosophy that I find interesting (links to come).
This site is (still and perennially) under construction.