
Contact Information

Vitaly Feldman
Research Staff Member
Almaden Research Center, San Jose, CA, USA
vitaly.edu@gmail.com      +1-408-927-1013



I'm a research scientist in the CS Theory Group at the IBM Almaden Research Center.

In the spring of 2014 I'm also a visiting scientist at the Simons Institute for the Theory of Computing, UC Berkeley.

Before joining IBM in August 2007, I spent five very enjoyable years at Harvard University, first as a PhD student advised by Prof. Leslie Valiant and then as a postdoc.

Previously, I studied at the Technion, where I received a BA and an MSc in CS (advised by Nader Bshouty), and worked at IBM Research in Haifa.


Research

My research interests are primarily in Computational Learning Theory (Wikipedia entry) and Computational Complexity. I also work on understanding natural learning systems: learning by the brain and evolution as learning. This work is based on models pioneered by Leslie Valiant (brain, evolvability).

Here are some of my recent works, my Ph.D. thesis and surveys, and a complete list of publications (with abstracts).

Recent/upcoming conference program committees: ALT 2014, COLT 2014, ALT 2013, COLT 2013, ALT 2012, COLT 2012, UAI 2011, SODA 2011.


Recent work

  1. Sample Complexity Bounds on Differentially Private Learning via Communication Complexity.
    With David Xiao. Manuscript. Feb, 2014.
  2. On the Complexity of Random Satisfiability Problems with Planted Solutions.
    With Will Perkins and Santosh Vempala. Manuscript. Nov, 2013.
  3. Learning Coverage Functions and Private Release of Marginals.
    With Pravesh Kothari. Manuscript. Feb, 2014.
  4. Statistical Active Learning Algorithms for Noise Tolerance and Differential Privacy.
    With Nina Balcan. NIPS 2013. Submitted to Algorithmica (by invitation).
  5. Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas.
    With Jan Vondrak. FOCS 2013. Submitted to SICOMP Special Issue on FOCS (by invitation); SIGecom Exchanges 12.2 (invited letter).
  6. Learning using Local Membership Queries.
    With Pranjal Awasthi and Varun Kanade. COLT 2013, Best Student (Co-authored) Paper Award.
  7. Representation, Approximation and Learning of Submodular Functions Using Low-rank Decision Trees.
    With Pravesh Kothari and Jan Vondrak. COLT 2013.
  8. Provenance-based Dictionary Refinement in Information Extraction.
    With Sudeepa Roy, Laura Chiticariu, Frederick Reiss and Huaiyu Zhu. SIGMOD 2013.
  9. Statistical Algorithms and a Lower Bound for Detecting Planted Cliques.
    With Elena Grigorescu, Lev Reyzin, Santosh Vempala and Ying Xiao. STOC 2013.
  10. Computational Bounds on Statistical Query Learning.
    With Varun Kanade. COLT 2012.
  11. Nearly Optimal Solutions for the Chow Parameters Problem and Low-weight Approximation of Halfspaces.
    With Anindya De, Ilias Diakonikolas and Rocco Servedio. STOC 2012; JACM (to appear).
  12. Learning DNF Expressions from Fourier Spectrum.
    COLT 2012.
  13. Distribution-Independent Evolvability of Linear Threshold Functions.
    COLT 2011.
  14. Lower Bounds and Hardness Amplification for Learning Shallow Monotone Formulas.
    With Homin Lee and Rocco Servedio. COLT 2011; JMLR 2014 (to appear).
  15. Distribution-Specific Agnostic Boosting.
    ICS 2010.
  16. Agnostic Learning of Monomials by Halfspaces is Hard.
    With Venkatesan Guruswami, Prasad Raghavendra and Yi Wu. FOCS 2009; SICOMP 2012 .
  17. A Complete Characterization of Statistical Query Learning with Applications to Evolvability.
    FOCS 2009; JCSS 2012 (by invitation).
  18. Sorting and Selection with Imprecise Comparisons.
    With Miklos Ajtai, Avinatan Hassidim and Jelani Nelson. ICALP (Track A), 2009. The full version is now available (Feb 2012).
  19. Robustness of Evolvability.
    COLT 2009.
  20. Experience-Induced Neural Circuits That Achieve High Capacity.
    With Leslie Valiant. Neural Computation 21:10, 2009.
  21. On The Power of Membership Queries in Agnostic Learning.
    COLT 2008; JMLR 10, 2009.
  22. Evolvability from Learning Algorithms.
    STOC 2008.
  23. Separating Models of Learning with Faulty Teachers.
    With Shrenik Shah. ALT 2007; TCS 410(19), 2009 (by invitation).
  24. New Results for Learning Noisy Parities and Halfspaces.
    With Parikshit Gopalan, Subhash Khot, and Ashok Ponnuswami. FOCS 2006; SICOMP 39(2), 2009 (by invitation).
  25. Optimal Hardness Results for Maximizing Agreements with Monomials.
    CCC 2006; SICOMP 39(2), 2009 (by invitation).
  26. Hardness of Approximate Two-level Logic Minimization and PAC Learning with Membership Queries.
    STOC 2006; JCSS 75(1), 2009 (by invitation).
  27. Attribute Efficient and Non-adaptive Learning of Parities and DNF Expressions.
    COLT 2005, Best Student Paper Award; JMLR 8, 2007 (by invitation).
  28. The Complexity of Properly Learning Simple Concept Classes.
    With Misha Alekhnovich, Mark Braverman, Adam Klivans, and Toni Pitassi.
    FOCS 2004; JCSS 74(1), 2008 (by invitation).

Ph.D. thesis: Efficiency and Computational Limitations of Learning Algorithms. Harvard University, January 2007.

Surveys:

  1. The Learning Power of Evolution.
    With Leslie Valiant. Open Problems/New Directions at COLT 2008.
  2. Hardness of Proper Learning.
    The Encyclopedia of Algorithms. Springer-Verlag, 2008.
  3. Statistical Query Learning.
    The Encyclopedia of Algorithms. Springer-Verlag, 2008.

A bit more about me

I spend a lot of time in the wonderful company of Polina, Aviv and tiny Milan.

Some of my favorite hobbies are ballroom dancing, mountain biking, photography, hiking, skiing, drinking tea and visiting Paris.