IBM Almaden Research Center, San Jose, CA, USA
In the spring of 2014, I am also a visiting scientist at the Simons Institute for the Theory of Computing, UC Berkeley.
Before joining IBM in August 2007, I spent five very enjoyable years at Harvard University, first as a PhD student advised by Prof. Leslie Valiant and then as a postdoc.
My research interests are primarily in Computational Learning Theory (Wikipedia entry) and Computational Complexity. I also work on understanding natural learning systems: learning by the brain and evolution as learning. This work builds on the models pioneered by Leslie Valiant (brain, evolvability).
- Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
With David Xiao. Manuscript, February 2014.
- On the Complexity of Random Satisfiability Problems with Planted Solutions.
With Will Perkins and Santosh Vempala. Manuscript, November 2013.
- Learning Coverage Functions and Private Release of Marginals.
With Pravesh Kothari. Manuscript, February 2014.
- Statistical Active Learning Algorithms for Noise Tolerance and Differential Privacy.
With Nina Balcan. NIPS 2013. Submitted to Algorithmica (by invitation).
- Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas.
With Jan Vondrak. FOCS 2013. Submitted to SICOMP Special Issue on FOCS (by invitation); SIGecom Exchanges 12.2 (invited letter).
- Learning using Local Membership Queries.
With Pranjal Awasthi and Varun Kanade. COLT 2013, Best Student (Co-authored) Paper Award.
- Representation, Approximation and Learning of Submodular Functions Using Low-rank Decision Trees.
With Pravesh Kothari and Jan Vondrak. COLT 2013.
- Provenance-based Dictionary Refinement in Information Extraction.
With Sudeepa Roy, Laura Chiticariu, Frederick Reiss and Huaiyu Zhu. SIGMOD 2013.
- Statistical Algorithms and a Lower Bound for Detecting Planted Cliques.
With Elena Grigorescu, Lev Reyzin, Santosh Vempala and Ying Xiao. STOC 2013.
- Computational Bounds on Statistical Query Learning.
With Varun Kanade. COLT 2012.
- Nearly Optimal Solutions for the Chow Parameters Problem and Low-weight Approximation of Halfspaces.
With Anindya De, Ilias Diakonikolas and Rocco Servedio. STOC 2012; JACM (to appear).
- Learning DNF Expressions from Fourier Spectrum.
- Distribution-Independent Evolvability of Linear Threshold Functions.
- Lower Bounds and Hardness Amplification for Learning Shallow Monotone Formulas.
With Homin Lee and Rocco Servedio. COLT 2011; JMLR 2014 (to appear).
- Distribution-Specific Agnostic Boosting.
- Agnostic Learning of Monomials by Halfspaces is Hard.
With Venkatesan Guruswami, Prasad Raghavendra and Yi Wu. FOCS 2009; SICOMP 2012.
- A Complete Characterization of Statistical Query Learning with Applications to Evolvability.
FOCS 2009; JCSS 2012 (by invitation).
- Sorting and Selection with Imprecise Comparisons.
With Miklos Ajtai, Avinatan Hassidim and Jelani Nelson. ICALP (Track A) 2009. Full version available (Feb. 2012).
- Robustness of Evolvability.
- Experience-Induced Neural Circuits That Achieve High Capacity.
With Leslie Valiant. Neural Computation 21:10, 2009.
- On The Power of Membership Queries in Agnostic Learning.
COLT 2008; JMLR 10, 2009.
- Evolvability from Learning Algorithms.
- Separating Models of Learning with Faulty Teachers.
With Shrenik Shah. ALT 2007; TCS 410(19), 2009 (by invitation).
- New Results for Learning Noisy Parities and Halfspaces.
With Parikshit Gopalan, Subhash Khot, and Ashok Ponnuswami. FOCS 2006; SICOMP 39(2), 2009 (by invitation).
- Optimal Hardness Results for Maximizing Agreements with Monomials.
CCC 2006; SICOMP 39(2), 2009 (by invitation).
- Hardness of Approximate Two-level Logic Minimization and PAC Learning with Membership Queries.
STOC 2006; JCSS 75(1), 2009 (by invitation).
- Attribute Efficient and Non-adaptive Learning of Parities and DNF Expressions.
COLT 2005, Best Student Paper Award; JMLR 8, 2007 (by invitation).
- The Complexity of Properly Learning Simple Concept Classes.
With Misha Alekhnovich, Mark Braverman, Adam Klivans, and Toni Pitassi.
FOCS 2004; JCSS 74(1), 2008 (by invitation).
Ph.D. thesis: Efficiency and Computational Limitations of Learning Algorithms. Harvard University, January 2007.
A bit more about me