Automatic taxonomy generation: Issues and possibilities
Raghu Krishnapuram, Krishna Kummamuru
IFSA 2003
Motivated by recent work on stochastic gradient descent methods, we develop two stochastic variants of greedy algorithms for possibly non-convex optimization problems with sparsity constraints. We prove linear convergence in expectation to the solution within a specified tolerance. This generalized framework is specialized to the problems of sparse signal recovery in compressed sensing and low-rank matrix recovery, giving methods with provable convergence guarantees that often outperform their deterministic counterparts. We also analyze settings where gradients and projections can only be computed approximately, and prove that the methods are robust to these approximations. We include numerous numerical experiments, which align with the theoretical analysis and demonstrate these improvements in several different settings.
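The abstract describes the framework only at a high level. As a rough illustration of the kind of stochastic greedy method it refers to in the compressed-sensing setting, the sketch below implements a basic stochastic iterative hard thresholding loop: each iteration takes a gradient step on the least-squares loss of a random mini-batch of measurements and then projects back onto the set of k-sparse vectors. The function names, step size, batch size, and problem dimensions are illustrative assumptions and are not taken from the paper:

import numpy as np

def hard_threshold(x, k):
    """Project onto k-sparse vectors: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def stochastic_iht(A, y, k, step=1.0, batch_size=10, n_iters=300, seed=0):
    """Minimal stochastic iterative hard thresholding (illustrative sketch only).

    Each iteration computes the gradient of 0.5*||A_b x - y_b||^2 on a random
    mini-batch of measurements, takes a gradient step, and hard-thresholds.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        rows = rng.choice(m, size=batch_size, replace=False)
        A_b, y_b = A[rows], y[rows]
        grad = A_b.T @ (A_b @ x - y_b)          # mini-batch least-squares gradient
        x = hard_threshold(x - step * grad, k)  # greedy projection onto k-sparse vectors
    return x

# Toy example: recover a 5-sparse signal from 80 Gaussian measurements (noiseless).
rng = np.random.default_rng(1)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)    # roughly unit-norm columns
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = stochastic_iht(A, y, k)
print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

In the noiseless case every mini-batch gradient vanishes at the true signal, which is why a constant step size can still drive the iterates all the way to the solution; the paper's analysis of linear convergence in expectation concerns this style of iteration, though its exact algorithms, step-size rules, and sampling schemes may differ from this sketch.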