Label-Noise Robust Deep Generative Model for Semi-Supervised Learning
Deep generative models have demonstrated an excellent ability to generate data by learning the underlying data distribution. Despite their unsupervised nature, these models can be applied in semi-supervised learning scenarios by treating the class labels as additional latent variables. In this article, we propose a deep generative model for semi-supervised learning that mitigates label noise, which is ubiquitous in large-scale datasets owing to the high cost of annotation. We assume that the noisy labels are generated from the true labels and employ a noise transition matrix to describe this corruption process. We estimate this matrix by adjusting its entries to minimize its difference from the true transition matrix, and we use the estimated matrix to formulate the objective function for inference, which consists of an evidence lower bound and a classification risk. However, because directly minimizing the latter with noisy labels may result in an inaccurate classifier, we propose a statistically consistent estimator that computes the classification risk solely from noisy data. Empirical results on benchmark datasets demonstrate that the proposed model improves classification performance over the baseline algorithms, and we also present a case study on semiconductor manufacturing. Additionally, we show empirically that the proposed model, as a generative model, is capable of reconstructing data even in the presence of noisy labels.
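The abstract mentions a noise transition matrix and a statistically consistent estimator of the classification risk computed from noisy labels. As a rough illustration only, the sketch below shows one common way such a matrix can be used to correct a classification loss (a forward-correction-style loss); the function names, the weighting term `alpha`, and the exact correction form are assumptions for exposition, not the article's own estimator.

```python
# Illustrative sketch (assumptions, not the article's exact method):
# T[i, j] = P(noisy label j | true label i). Mapping the classifier's
# clean-label posterior through T yields the predicted noisy-label
# distribution, so cross-entropy against the observed noisy labels can
# serve as a noise-corrected classification risk when T is known or
# well estimated.
import torch
import torch.nn.functional as F

def noise_corrected_risk(logits, noisy_labels, T):
    """logits: (batch, C) scores over true classes; noisy_labels: (batch,)
    observed labels; T: (C, C) noise transition matrix."""
    p_true = F.softmax(logits, dim=1)   # P(true label | x)
    p_noisy = p_true @ T                # P(noisy label | x)
    return F.nll_loss(torch.log(p_noisy + 1e-12), noisy_labels)

# A semi-supervised objective of the kind the abstract describes would then
# combine the generative model's evidence lower bound with this risk, e.g.
#   loss = -elbo + alpha * noise_corrected_risk(logits, y_noisy, T)
# where alpha is an illustrative weighting hyperparameter.
```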
Barcode | Collection Type | Call Number | Location | Status
---|---|---|---|---
art144902 | null | Artikel | Gdg9-Lt3 | Available but not for loan (No Loan)
No other versions available.