Signed Graph Metric Learning via Gershgorin Disc Perfect Alignment
Given a convex and differentiable objective $Q(M)$ for a real symmetric matrix $M$ in the positive definite (PD) cone, used to compute Mahalanobis distances, we propose a fast, general metric learning framework that is entirely projection-free. We first assume that $M$ resides in a space $\mathcal{S}$ of generalized graph Laplacian matrices corresponding to balanced signed graphs; an $M \in \mathcal{S}$ that is also PD is called a graph metric matrix. Unlike the low-rank metric matrices common in the literature, $\mathcal{S}$ includes the important diagonal-only matrices as a special case. The key theorem that circumvents full eigen-decomposition and enables fast metric matrix optimization is Gershgorin disc perfect alignment (GDPA): given $M \in \mathcal{S}$ and a diagonal matrix $S$ with $S_{ii} = 1/v_i$, where $v$ is the first eigenvector of $M$, we prove that the Gershgorin disc left-ends of the similarity transform $B = S M S^{-1}$ are perfectly aligned at the smallest eigenvalue $\lambda_{\min}$. Using this theorem, we replace the PD cone constraint in the metric learning problem with the tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal and off-diagonal terms of $M$ can be solved efficiently as linear programs via the Frank-Wolfe method. We update $v$ using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries of $M$ are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection schemes and produces competitive binary classification performance.
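The alignment claim is easy to check numerically. Below is a minimal sketch (not the paper's code), assuming a small hand-built balanced signed graph: we form a generalized Laplacian, switch the polarity of two nodes so that edges across the two node sets become negative, build $S = \mathrm{diag}(1/v_i)$ from the first eigenvector, and verify that every Gershgorin disc left-end of $B = S M S^{-1}$ lands on $\lambda_{\min}$. The edge weights and the 0.1 self-loop term are illustrative choices, not values from the paper.

```python
import numpy as np

# Unsigned connected 5-node graph with positive edge weights (illustrative values).
W = np.array([
    [0.0, 2.0, 1.0, 0.0, 0.0],
    [2.0, 0.0, 3.0, 1.0, 0.0],
    [1.0, 3.0, 0.0, 2.0, 1.0],
    [0.0, 1.0, 2.0, 0.0, 2.0],
    [0.0, 0.0, 1.0, 2.0, 0.0],
])
D = np.diag(W.sum(axis=1))
L = D - W + 0.1 * np.eye(5)          # generalized graph Laplacian, strictly PD here

# Switch polarities of nodes {3, 4}: edges across the two node sets become negative,
# giving the generalized Laplacian M of a balanced signed graph.
t = np.array([1.0, 1.0, 1.0, -1.0, -1.0])
M = np.diag(t) @ L @ np.diag(t)

# First eigenvector v (for the smallest eigenvalue) and the GDPA scaling S = diag(1/v_i).
evals, evecs = np.linalg.eigh(M)
lam_min, v = evals[0], evecs[:, 0]
S = np.diag(1.0 / v)
B = S @ M @ np.diag(v)               # similarity transform B = S M S^{-1}, since S^{-1} = diag(v)

# Gershgorin disc left-ends of B: B_ii - sum_{j != i} |B_ij|.
left_ends = np.diag(B) - (np.abs(B).sum(axis=1) - np.abs(np.diag(B)))
print("lambda_min     :", lam_min)
print("disc left-ends :", left_ends)  # all numerically equal to lambda_min
```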
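For the eigenvector update, one option consistent with the abstract's description is SciPy's LOBPCG solver with the previous eigenvector reused as the initial guess. The sketch below only illustrates that warm-start pattern on a synthetic sparse Laplacian; the random graph, the 0.1 self-loop term, and the 0.01 shift standing in for one metric-update step are all assumptions, not values from the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lobpcg

rng = np.random.default_rng(0)
n = 300

# Synthetic sparse symmetric nonnegative weight matrix and its generalized Laplacian.
A = sp.random(n, n, density=0.05, random_state=0)
W = abs(A) + abs(A).T
M = sp.diags(np.ravel(W.sum(axis=1))) - W + 0.1 * sp.eye(n)

# Cold start: random initial block for the single smallest eigenpair.
X0 = rng.standard_normal((n, 1))
lam, V = lobpcg(M.tocsr(), X0, largest=False, tol=1e-8, maxiter=1000)

# Warm start: after M changes slightly (one optimization step), reuse the previous
# eigenvector V as the initial guess; far fewer iterations are needed in practice.
M_next = M + 0.01 * sp.eye(n)
lam_next, V_next = lobpcg(M_next.tocsr(), V, largest=False, tol=1e-8, maxiter=1000)
print("lambda_min before/after update:", lam[0], lam_next[0])
```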
Barcode | Collection Type | Call Number | Location | Status
---|---|---|---|---
art145268 | null | Artikel | Gdg9-Lt3 | Available but not for loan - No Loan
No other versions available