$f$-Divergence Based Classification: Beyond the Use of Cross-Entropy
May 17, 2024, 4:43 a.m. | Nicola Novello, Andrea M. Tonello
cs.LG updates on arXiv.org
Abstract: In deep learning, classification tasks are formalized as optimization problems, often solved by minimizing the cross-entropy. However, recent advances in the design of objective functions allow the use of the $f$-divergence to generalize the formulation of the optimization problem for classification. We adopt a Bayesian perspective and formulate the classification task as a maximum a posteriori probability problem. We propose a class of objective functions based on the variational representation of the $f$-divergence. …
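The variational representation the abstract refers to expresses an $f$-divergence as a supremum over test functions, $D_f(P\|Q) = \sup_T \, \mathbb{E}_P[T] - \mathbb{E}_Q[f^*(T)]$, where $f^*$ is the convex conjugate of $f$. The paper's specific objectives are not reproduced here; the following is only a toy NumPy sketch for the KL case ($f(u) = u\log u$, $f^*(t) = e^{t-1}$), where the supremum is attained at $T^* = 1 + \log(p/q)$, so the bound is tight. The distributions `p` and `q` are arbitrary illustrative values.

```python
import numpy as np

# Toy discrete distributions (illustrative values, not from the paper).
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.4, 0.4, 0.2])

# Exact KL divergence D_KL(p || q) = sum_x p(x) log(p(x)/q(x)).
kl_exact = np.sum(p * np.log(p / q))

# Variational representation for f(u) = u log u, whose convex
# conjugate is f*(t) = exp(t - 1). The optimal test function is
# T*(x) = 1 + log(p(x)/q(x)); plugging it in recovers the KL exactly.
T = 1.0 + np.log(p / q)
kl_variational = np.sum(p * T) - np.sum(q * np.exp(T - 1.0))

print(kl_exact, kl_variational)  # the two values coincide
```

With a suboptimal $T$ the second expression only lower-bounds the divergence, which is what makes it usable as a trainable objective: a network parameterizes $T$ and maximizing the bound tightens it.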