Presentation at the International Workshop on Continuous Optimization
Date:
A proximal gradient method with Bregman distance in multi-objective optimization
Kangming Chen (coauthors: Ellen Hidemi Fukuda, Nobuo Yamashita)
Abstract: Recently, a multi-objective proximal gradient method was proposed as a suitable descent method for composite multi-objective optimization problems. However, the method defines its subproblems only with Euclidean distances, and it requires the differentiable part of each objective to have a Lipschitz continuous gradient, which limits its applicability. We propose an extension of this method that uses Bregman distances and requires a less demanding assumption called relative smoothness. We also consider two stepsize strategies: a constant stepsize and a backtracking procedure. In both cases, we prove global convergence in the sense of Pareto stationarity, and we analyze the convergence rate through some merit functions.
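To give a feel for the kind of update involved, here is a minimal sketch of a Bregman proximal gradient step in the single-objective special case, using the negative-entropy kernel h(x) = Σ x_i log x_i on the positive orthant, for which the subproblem has a closed-form multiplicative update. This is only an illustration under assumed simplifications, not the multi-objective algorithm of the talk (whose subproblem involves a max over objectives); the function names and the quadratic test problem are invented for the example.

```python
import numpy as np

def bregman_prox_grad(grad_f, x0, t=0.1, iters=500):
    """Illustrative Bregman proximal gradient iteration (mirror-descent form).

    With the negative-entropy kernel h(x) = sum_i x_i log x_i, the subproblem
        min_z  <grad f(x), z - x> + (1/t) * D_h(z, x)
    over the positive orthant has the closed-form solution
        z_i = x_i * exp(-t * grad_i),
    so each iteration is a simple multiplicative update.
    (In the talk's method, t would come from a constant stepsize tied to
    relative smoothness, or from a backtracking procedure.)
    """
    x = x0.copy()
    for _ in range(iters):
        x = x * np.exp(-t * grad_f(x))
    return x

# Toy example: minimize f(x) = 0.5 * ||x - b||^2 over the positive orthant.
b = np.array([1.0, 2.0, 0.5])
grad_f = lambda x: x - b
x_star = bregman_prox_grad(grad_f, np.ones(3), t=0.1, iters=500)
```

Starting from the all-ones vector, the iterates stay strictly positive by construction and converge to the unconstrained minimizer b, since b lies in the positive orthant.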