Yura Malitsky

Publications

[1] Y. Malitsky and M. K. Tam. Resolvent splitting for sums of monotone operators with minimal lifting. Mathematical Programming, 201(1):231--262, 2023. [ DOI | arXiv ]
[2] A. Alacaoglu, A. Böhm, and Y. Malitsky. Beyond the golden ratio for variational inequality algorithms. Journal of Machine Learning Research, 24:1--33, 2023. [ arXiv | .html ]
[3] Z. Chen and Y. Malitsky. Over-the-air computation with multiple receivers: A space-time approach. IEEE Wireless Communications Letters, 12(8):1399--1403, 2023. [ DOI ]
[4] Z. Chen, E. G. Larsson, C. Fischione, M. Johansson, and Y. Malitsky. Over-the-air computation for distributed systems: Something old and something new. IEEE Network, 2023. [ DOI | arXiv ]
[5] Y. Malitsky and M. K. Tam. A first-order algorithm for decentralised min-max problems, 2023. [ arXiv ]
[6] Y. Malitsky and K. Mishchenko. Adaptive proximal gradient method for convex optimization, 2023. [ arXiv ]
[7] A. Alacaoglu and Y. Malitsky. Stochastic variance reduction for variational inequality methods. In Proceedings of Thirty Fifth Conference on Learning Theory, volume 178, pages 778--816. PMLR, 2022. [ arXiv | .html ]
[8] F. J. Aragón-Artacho, Y. Malitsky, M. K. Tam, and D. Torregrosa-Belén. Distributed forward-backward methods for ring networks. Computational Optimization and Applications, 2022. [ DOI | arXiv ]
[9] A. Alacaoglu, Y. Malitsky, and V. Cevher. Convergence of adaptive algorithms for weakly convex constrained optimization. In NeurIPS, volume 34, pages 14214--14225, 2021. [ arXiv | .html ]
[10] A. Alacaoglu, Y. Malitsky, and V. Cevher. Forward-reflected-backward method with variance reduction. Computational Optimization and Applications, 80(2):321--346, 2021. [ DOI ]
[11] M.-L. Vladarean, Y. Malitsky, and V. Cevher. A first-order primal-dual method with adaptivity to local smoothness. In NeurIPS, volume 34, pages 6171--6182, 2021. [ arXiv | .html ]
[12] Y. Malitsky and M. K. Tam. A forward-backward splitting method for monotone inclusions without cocoercivity. SIAM Journal on Optimization, 30(2):1451--1472, 2020. [ DOI | arXiv ]
[13] K. Mishchenko, D. Kovalev, E. Shulgin, P. Richtárik, and Y. Malitsky. Revisiting stochastic extragradient. In International Conference on Artificial Intelligence and Statistics, 2020. [ arXiv | .html ]
[14] Y. Malitsky. Golden ratio algorithms for variational inequalities. Mathematical Programming, 184:383--410, 2020. [ DOI | arXiv ]
[15] Y. Malitsky and K. Mishchenko. Adaptive gradient descent without descent. In Proceedings of the 37th International Conference on Machine Learning, volume 119, pages 6702--6712. PMLR, 2020. [ arXiv | .html ]
[16] A. Alacaoglu, Y. Malitsky, P. Mertikopoulos, and V. Cevher. A new regret analysis for Adam-type algorithms. In Proceedings of the 37th International Conference on Machine Learning, volume 119, pages 202--210. PMLR, 2020. [ arXiv | .html ]
[17] Y. Malitsky and P. Ochs. Model function based conditional gradient method with Armijo-like line search. In Proceedings of the 36th International Conference on Machine Learning, pages 4891--4900, 2019. [ arXiv | .pdf ]
[18] E. R. Csetnek, Y. Malitsky, and M. K. Tam. Shadow Douglas-Rachford splitting for monotone inclusions. Applied Mathematics & Optimization, 80(3):665--678, 2019. [ DOI | arXiv ]
[19] Y. Malitsky. Proximal extrapolated gradient methods for variational inequalities. Optimization Methods and Software, 33(1):140--164, 2018. [ DOI | arXiv ]
[20] D. R. Luke and Y. Malitsky. Block-coordinate primal-dual method for nonsmooth minimization over linear constraints. In Large-Scale and Distributed Optimization, pages 121--147. Springer, Cham, 2018. [ DOI | arXiv ]
[21] Y. Malitsky and T. Pock. A first-order primal-dual algorithm with linesearch. SIAM Journal on Optimization, 28(1):411--432, 2018. [ DOI | arXiv ]
[22] Y. Malitsky. The primal-dual hybrid gradient method reduces to a primal method for linearly constrained optimization problems, 2017. [ arXiv ]
[23] Y. Malitsky. Projected reflected gradient methods for monotone variational inequalities. SIAM Journal on Optimization, 25(1):502--520, 2015. [ DOI | arXiv ]
SIAM Student Paper Award, 2015
[24] Y. V. Malitsky and V. Semenov. A hybrid method without extrapolation step for solving variational inequality problems. Journal of Global Optimization, 61(1):193--202, 2015. [ DOI | arXiv ]
[25] Y. V. Malitsky and V. Semenov. An extragradient algorithm for monotone variational inequalities. Cybernetics and Systems Analysis, 50(2):271--277, 2014. [ DOI ]