You Don’t Need Many Labels to Learn


A Gaussian Mixture Variational Autoencoder (GMVAE) can be trained entirely without labels and then converted into a classifier using as little as 0.2% labeled data, roughly 35x less than XGBoost needs for comparable accuracy. The key insight is that unsupervised training already discovers the data's cluster structure; labels are only needed afterwards, to assign a class name to each cluster.
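The "clusters become a classifier" step can be sketched in miniature. The snippet below is an illustrative sketch, not the article's implementation: it stands in a plain `sklearn` Gaussian mixture for the GMVAE, fits it with no labels, then uses a handful of labeled points to name each cluster by majority vote. All data and variable names are made up for the example.

```python
# Illustrative sketch of the cluster-naming idea, with a plain Gaussian
# mixture standing in for the GMVAE described in the article.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Two synthetic classes; true labels stay hidden during clustering.
X = np.vstack([rng.normal(-2.0, 1.0, size=(500, 2)),
               rng.normal(+2.0, 1.0, size=(500, 2))])
y = np.array([0] * 500 + [1] * 500)

# Step 1: fully unsupervised -- fit the mixture without any labels.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
clusters = gmm.predict(X)

# Step 2: reveal labels for a tiny subset (10 of 1000 points here) and
# name each cluster by majority vote over its labeled members.
labeled = rng.choice(len(X), size=10, replace=False)
cluster_to_label = {
    c: np.bincount(y[labeled][clusters[labeled] == c]).argmax()
    for c in np.unique(clusters[labeled])
}

# Step 3: the unsupervised clustering is now a classifier.
# (.get with a default guards the unlikely case of an unnamed cluster.)
y_pred = np.array([cluster_to_label.get(c, 0) for c in clusters])
accuracy = (y_pred == y).mean()
print(f"accuracy with 10 labels: {accuracy:.2f}")
```

The point of the sketch is that steps 1 and 2 are decoupled: the expensive part (finding structure) consumes no labels, and the labels only resolve which cluster corresponds to which class.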

10 min read · From towardsdatascience.com
Table of contents
- Introduction
- Turning Clusters Into a Classifier
- How Much Supervision Do We Need in Practice?
- Conclusion
