Show simple item record

dc.contributor.advisor: Davies, Michael
dc.contributor.advisor: Escudero Rodriguez, Javier
dc.contributor.author: Tang, Junqi
dc.date.accessioned: 2019-09-17T09:01:51Z
dc.date.available: 2019-09-17T09:01:51Z
dc.date.issued: 2019-11-28
dc.identifier.uri: http://hdl.handle.net/1842/36141
dc.description.abstract: This thesis advances the state of the art in randomized optimization algorithms for efficiently solving the large-scale composite optimization problems that appear increasingly frequently in modern statistical machine learning and signal processing applications in this big-data era. It contributes from a particular point of view: the low-dimensional structure of a composite optimization problem's solution (such as sparsity, group sparsity, piecewise smoothness, or low-rank structure) can be actively exploited by purposefully tailored optimization algorithms to achieve even faster convergence rates; these are the structure-adaptive algorithms. Driven by this motivation, several randomized optimization algorithms are designed and analyzed in this thesis. The proposed methods are provably equipped with the desirable structure-adaptive property, and include sketched gradient descent algorithms as well as structure-adaptive variants of accelerated stochastic variance-reduced gradient descent and randomized coordinate descent. The thesis provides successful and inspiring paradigms for the algorithmic design of randomized structure-adaptive methods, confirming that low-dimensional structure is indeed a promising "hidden treasure" to be exploited for accelerating large-scale optimization.
dc.language.iso: en
dc.publisher: The University of Edinburgh
dc.relation.hasversion: Junqi Tang, Mohammad Golbabaee, Mike Davies. "Gradient Projection Iterative Sketch for Large-Scale Constrained Least-Squares", in Proc. of the 34th International Conference on Machine Learning (ICML), 2017.
dc.relation.hasversion: Junqi Tang, Mohammad Golbabaee, Mike Davies. "Exploiting the Structure via Sketched Gradient Algorithms", in Proc. of the 5th IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2017.
dc.relation.hasversion: Junqi Tang, Mohammad Golbabaee, Francis Bach, Mike Davies. "Rest-Katyusha: Exploiting the Solution's Structure via Scheduled Restart Schemes", in Advances in Neural Information Processing Systems (NeurIPS), 2018.
dc.relation.hasversion: Junqi Tang, Karen Egiazarian, Mike Davies. "The Limitation and Practical Acceleration of Stochastic Gradient Algorithms in Inverse Problems", in Proc. of the 44th International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019.
dc.relation.hasversion: J. Tang, M. Golbabaee, F. Bach, and M. Davies, "Structure-adaptive accelerated coordinate descent," HAL archive hal-01889990v2, 2018.
dc.relation.hasversion: J. Tang, F. Bach, M. Golbabaee, and M. Davies, "Structure-adaptive, variance-reduced, and accelerated stochastic optimization," arXiv preprint arXiv:1712.03156, 2017.
dc.subject: optimization algorithms
dc.subject: randomization techniques
dc.subject: randomized optimization algorithms
dc.subject: piece-wise smoothness
dc.title: Randomized structure-adaptive optimization
dc.type: Thesis or Dissertation
dc.type.qualificationlevel: Doctoral
dc.type.qualificationname: PhD Doctor of Philosophy

