My wife and I are finally at KAUST. We have to stay in quarantine until January 10.
Published in The Thirty-seventh International Conference on Machine Learning (ICML 2020), 2020
Grigory Malinovsky, Dmitry Kovalev, Elnur Gasanov, Laurent Condat, Peter Richtárik
Published in The 12th Annual Workshop on Optimization for Machine Learning (NeurIPS Workshop OPT2020), Spotlight, 2020
Laurent Condat, Grigory Malinovsky, Peter Richtárik
Published in 2021
Grigory Malinovsky, Alibek Sailanbayev, Peter Richtárik
We propose a new variant of the classic heavy-ball method for unconstrained optimization, called the averaged heavy-ball method. The method reduces the peak effect, thereby providing a better convergence rate in the initial iterations. Best Poster Award.
We propose a new variant of the classic heavy-ball method for unconstrained optimization, called the averaged heavy-ball method. The method reduces the peak effect, thereby providing a better convergence rate in the initial iterations. We provide an analysis for the quadratic case and conduct numerical experiments supporting the theory for a wider class of problems. Best Talk Award.
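The abstracts above do not specify the exact averaging scheme, so the following is only a rough illustrative sketch, not the method from the talk: it runs the classic heavy-ball iteration on a quadratic and additionally tracks a running mean of the iterates, which damps the oscillations ("peak effect") of the early iterations. The function name `heavy_ball` and the classical quadratic tuning of the step size `alpha` and momentum `beta` are assumptions made for the example.

```python
import numpy as np

def heavy_ball(A, b, x0, alpha, beta, n_iters, averaged=False):
    """Minimize f(x) = 0.5 x^T A x - b^T x with the heavy-ball method.

    Classic update: x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1}).
    If averaged=True, return the running mean of the iterates instead of
    the last iterate (an illustrative averaging scheme, not the talk's).
    """
    x_prev = x0.copy()
    x = x0.copy()
    avg = x0.copy()
    for k in range(1, n_iters + 1):
        grad = A @ x - b
        x_next = x - alpha * grad + beta * (x - x_prev)
        x_prev, x = x, x_next
        avg += (x - avg) / (k + 1)  # running mean of x_0, ..., x_k
    return avg if averaged else x

# Usage on a small ill-conditioned quadratic (L = 100, mu = 1),
# with the classical heavy-ball tuning for quadratics.
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x0 = np.zeros(2)
L, mu = 100.0, 1.0
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
x_hb = heavy_ball(A, b, x0, alpha, beta, 200)
```

With these parameters the last iterate converges to the minimizer `A^{-1} b`, while early iterates oscillate; averaging trades some asymptotic accuracy for a smoother initial phase.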
We investigate the properties of the training dataset in classification and regression problems and propose a method for determining the complexity of the training data using a two-layer fully connected neural network: the number of neurons in the hidden layer serves as the complexity measure. Book of abstracts (page 68).
Tutoring, Physics Olympiads and Tests, 2019
Preparation of students in grades 7-11 for Olympiads and tests, 2017-2019
Undergraduate course, Teaching Assistance, MIPT, Spring term, 2019
This course aims to introduce students to the modern state of Machine Learning and Artificial Intelligence. It is designed to take one year (two terms at MIPT): approximately 2 × 15 lectures and seminars.
Undergraduate course, Teaching Assistance, MIPT, Fall term, 2019
Theory: Convex Sets and Functions, Optimality Conditions, Foundations of duality theory
Practice: Optimization Problem Statement, Methods for solving unconstrained problems, Methods for solving problems with simple constraints, Linear Programming, Conic Optimization Problems and SDP