Reduced Jacobian Method

El Maghri, M; Elboulqe, Y

El Maghri, M (reprint author), Hassan II Univ, Dept Math & Comp, Fac Sci Ain Chock, BP 5366, Casablanca, Morocco.

JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2018; 179 (3): 917

Abstract

In this paper, we extend Wolfe's reduced gradient method to multiobjective (multicriteria) optimization. We deal precisely with the problem of minimizing nonlinear objectives under linear constraints and propose a reduced Jacobian method, namely a reduced gradient-like method that does not scalarize these programs. As long as the current point is not Pareto critical, the principle is to determine a direction that decreases all objectives simultaneously. Following the reduction strategy, only a reduced search direction needs to be found; we show that the latter can be obtained by solving a simple differentiable and convex program at each iteration. Moreover, the method is conceived to recover both the discontinuous and continuous schemes of Wolfe for single-objective programs. The resulting algorithm is proved to be globally convergent to Pareto KKT-stationary (Pareto critical) points under classical hypotheses and a multiobjective Armijo line-search condition. Finally, experimental results over test problems show a clear performance advantage of the proposed algorithm and its superiority over a classical scalarization approach, both in the quality of the approximated Pareto front and in the computational effort.
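The abstract's key ingredients, a convex differentiable direction-finding subproblem and a multiobjective Armijo line search, can be sketched in a simplified setting. The snippet below is not the authors' reduced Jacobian method (which partitions the variables to exploit the linear constraints); it is a minimal unconstrained analogue that finds a common descent direction for all objectives by solving, at each iteration, the smooth convex program min over (d, t) of t + ½‖d‖² subject to ∇f_i(x)ᵀd ≤ t for every objective, then backtracks until all objectives satisfy an Armijo-type decrease. The bi-objective test problem, the starting point, and all tolerances are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical bi-objective problem (illustrative, not from the paper):
# f1(x) = ||x - 1||^2,  f2(x) = ||x + 1||^2; the Pareto set is the
# segment {t * ones : t in [-1, 1]}.
def f(x):
    return np.array([np.sum((x - 1.0) ** 2), np.sum((x + 1.0) ** 2)])

def jac(x):
    # Jacobian: one gradient per row.
    return np.vstack([2.0 * (x - 1.0), 2.0 * (x + 1.0)])

def common_descent_direction(J):
    """Solve min_{d,t} t + 0.5||d||^2  s.t.  g_i . d <= t for each row g_i.

    At a Pareto critical point the optimal value is 0 (d = 0, t = 0);
    otherwise t < 0 and d decreases every objective to first order.
    """
    m, n = J.shape
    z0 = np.zeros(n + 1)                       # z = (d, t)
    obj = lambda z: z[n] + 0.5 * z[:n] @ z[:n]
    cons = [{"type": "ineq", "fun": lambda z, g=g: z[n] - g @ z[:n]}
            for g in J]
    res = minimize(obj, z0, constraints=cons, method="SLSQP")
    return res.x[:n], res.x[n]

x = np.array([2.0, -0.5])                      # arbitrary starting point
for _ in range(100):
    J = jac(x)
    d, t = common_descent_direction(J)
    if t > -1e-8:                              # approximately Pareto critical
        break
    # Multiobjective Armijo backtracking: every objective must decrease.
    a, fx, sigma = 1.0, f(x), 1e-4
    while np.any(f(x + a * d) > fx + sigma * a * (J @ d)) and a > 1e-12:
        a *= 0.5
    x = x + a * d

print(x)  # a point close to the Pareto set {t * ones : t in [-1, 1]}
```

The inequality-constrained reformulation replaces the nonsmooth min–max direction subproblem with an ordinary smooth convex program, mirroring the "simple differentiable and convex program" mentioned in the abstract; in the paper this subproblem is posed in the reduced (nonbasic) variables only.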
