Tuesday, June 02, 2015

ICML 2015 papers are out


The proceedings for ICML 2015, which will take place in Lille, are out. Here is a small sample of papers that we have mentioned before or that are of interest to the general themes covered on Nuit Blanche:

Towards a Lower Sample Complexity for Robust One-bit Compressed Sensing
Rongda Zhu, Quanquan Gu
A Nearly-Linear Time Framework for Graph-Structured Sparsity
Chinmay Hegde, Piotr Indyk, Ludwig Schmidt
Large-scale log-determinant computation through stochastic Chebyshev expansions
Insu Han, Dmitry Malioutov, Jinwoo Shin
Compressing Neural Networks with the Hashing Trick
Wenlin Chen, James Wilson, Stephen Tyree, Kilian Weinberger, Yixin Chen
Multi-view Sparse Co-clustering via Proximal Alternating Linearized Minimization
Jiangwen Sun, Jin Lu, Tingyang Xu, Jinbo Bi
Learning Word Representations with Hierarchical Sparse Coding
Dani Yogatama, Manaal Faruqui, Chris Dyer, Noah Smith
Theory of Dual-sparse Regularized Randomized Reduction
Tianbao Yang, Lijun Zhang, Rong Jin, Shenghuo Zhu
Streaming Sparse Principal Component Analysis
Wenzhuo Yang, Huan Xu
Inferring Graphs from Cascades: A Sparse Recovery Framework
Jean Pouget-Abadie, Thibaut Horel
Swept Approximate Message Passing for Sparse Estimation
Andre Manoel, Florent Krzakala, Eric Tramel, Lenka Zdeborová
Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization
Tyler Johnson, Carlos Guestrin
Sparse Variational Inference for Generalized GP Models
Rishit Sheth, Yuyang Wang, Roni Khardon
A Deterministic Analysis of Noisy Sparse Subspace Clustering for Dimensionality-reduced Data
Yining Wang, Yu-Xiang Wang, Aarti Singh
Geometric Conditions for Subspace-Sparse Recovery
Chong You, Rene Vidal
Scaling up Natural Gradient by Sparsely Factorizing the Inverse Fisher Matrix
Roger Grosse, Ruslan Salakhutdinov
Sparse Subspace Clustering with Missing Entries
Congyuan Yang, Daniel Robinson, Rene Vidal
Statistical and Algorithmic Perspectives on Randomized Sketching for Ordinary Least-Squares
Garvesh Raskutti, Michael Mahoney
The Power of Randomization: Distributed Submodular Maximization on Massive Datasets
Rafael Barbosa, Alina Ene, Huy Nguyen, Justin Ward
A Unified Framework for Outlier-Robust PCA-like Algorithms
Wenzhuo Yang, Huan Xu
A Stochastic PCA and SVD Algorithm with an Exponential Convergence Rate
Ohad Shamir
Pushing the Limits of Affine Rank Minimization by Adapting Probabilistic PCA
Bo Xin, David Wipf
Stay on path: PCA along graph paths
Megasthenis Asteris, Anastasios Kyrillidis, Alex Dimakis, Han-Gyol Yi, Bharath Chandrasekaran
Deep Learning with Limited Numerical Precision
Suyog Gupta, Ankur Agrawal, Kailash Gopalakrishnan, Pritish Narayanan
Training Deep Convolutional Neural Networks to Play Go
Christopher Clark, Amos Storkey
 
Harmonic Exponential Families on Manifolds
Taco Cohen, Max Welling
 
Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)
Andrew Wilson, Hannes Nickisch
Learning Deep Structured Models
Liang-Chieh Chen, Alexander Schwing, Alan Yuille, Raquel Urtasun
Scalable Deep Poisson Factor Analysis for Topic Modeling
Zhe Gan, Changyou Chen, Ricardo Henao, David Carlson, Lawrence Carin

Scalable Bayesian Optimization Using Deep Neural Networks
Jasper Snoek, Oren Rippel, Kevin Swersky, Ryan Kiros, Nadathur Satish, Narayanan Sundaram, Mostofa Patwary, Mr Prabhat, Ryan Adams
Deep Unsupervised Learning using Nonequilibrium Thermodynamics
Jascha Sohl-Dickstein, Eric Weiss, Niru Maheswaranathan, Surya Ganguli
A Deeper Look at Planning as Learning from Replay
Harm Vanseijen, Rich Sutton
Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
