Abstract: We introduce a first-order method for solving semidefinite programs. The method has low per-iteration computational cost and is easy to implement. Each iteration alternates between two steps: a gradient-descent step on the objective function and a random projection step that reduces the infeasibility of the constraints. Its low per-iteration cost allows it to scale to large problems. We also prove the algorithm's convergence and demonstrate its performance on numerical examples.
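The alternating scheme described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes a standard-form SDP (minimize trace(C X) subject to trace(A_i X) = b_i and X positive semidefinite), uses a randomized Kaczmarz-style projection onto a single randomly chosen affine constraint as the "random projection" step, and projects onto the PSD cone by clipping negative eigenvalues.

```python
import numpy as np

def sdp_gd_random_projection(C, A_list, b, eta=0.01, iters=5000, seed=0):
    """Sketch of the alternating scheme: a gradient step on trace(C @ X),
    then a random projection that reduces constraint infeasibility.
    All parameter names and step choices are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    X = np.eye(n) / n  # simple PSD starting point with unit trace
    for _ in range(iters):
        # Gradient-descent step on the linear objective trace(C @ X).
        X = X - eta * C
        # Random projection step: pick one affine constraint
        # trace(A_i @ X) = b_i uniformly at random and project X
        # onto its hyperplane (randomized Kaczmarz update).
        i = rng.integers(len(A_list))
        Ai = A_list[i]
        X = X + (b[i] - np.trace(Ai @ X)) / np.linalg.norm(Ai, 'fro')**2 * Ai
        # Reduce infeasibility of the PSD constraint: symmetrize and
        # clip negative eigenvalues to project onto the PSD cone.
        w, V = np.linalg.eigh((X + X.T) / 2)
        X = (V * np.clip(w, 0.0, None)) @ V.T
    return X
```

On a toy problem with the single constraint trace(X) = 1 and C = diag(3, 2, 1), the iterates concentrate on the eigenvector of the smallest eigenvalue of C, so the objective approaches 1; because the PSD clip runs after the affine projection, the returned iterate carries a small O(eta) residual infeasibility, which a smaller step size shrinks.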