Abstract: This article considers constrained nonsmooth general convex and strongly convex optimization problems. For such problems, two novel distributed smoothing projection neurodynamic approaches (DSPNAs) are proposed to seek their optimal solutions with faster convergence rates in a distributed manner.
First, we equivalently transform the original constrained optimization problem into a standard smooth distributed problem with only local set constraints, based on exact penalty and smoothing approximation methods.
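As context, a minimal sketch of the smoothing idea (the specific smoothing functions used in this work may differ): a nonsmooth convex term such as the absolute value admits a smooth surrogate with a controllable approximation gap,
$$\tilde{\theta}(x, \mu) = \sqrt{x^2 + \mu^2}, \qquad 0 \le \tilde{\theta}(x, \mu) - |x| \le \mu,$$
so the smoothed problem recovers the original as the smoothing parameter $\mu \to 0^+$.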
Then, to deal with nonsmooth general convex optimization, we propose a novel DSPNA based on a continuous variant of Nesterov's acceleration (called DSPNA-N), which achieves a faster convergence rate of $O(1/t^2)$.
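For context, DSPNA-N builds on the continuous-time limit of Nesterov's acceleration; in its standard centralized form for a smooth convex objective $f$ (the Su--Boyd--Candès ODE; the distributed dynamics in this work additionally involve projection and consensus terms not shown here), it reads
$$\ddot{x}(t) + \frac{3}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0, \qquad f(x(t)) - f^\star = O(1/t^2).$$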
We also design a novel DSPNA inspired by a continuous variant of Polyak's heavy ball method (called DSPNA-P) to address the nonsmooth strongly convex optimization problem, with an explicit exponential convergence rate.
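Similarly, the centralized continuous-time heavy ball dynamics for a smooth $\sigma$-strongly convex $f$ with damping coefficient $a > 0$ (again omitting the projection and consensus structure of DSPNA-P) read
$$\ddot{x}(t) + a\,\dot{x}(t) + \nabla f(x(t)) = 0, \qquad f(x(t)) - f^\star = O(e^{-\kappa t})$$
for some $\kappa > 0$ depending on $a$ and $\sigma$.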
In addition, the existence, uniqueness, and feasibility of the solutions of the proposed DSPNAs are established. Finally, numerical results demonstrate the effectiveness of the proposed DSPNAs.