Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning
May 21, 2024, 4:46 a.m. | Nisha L. Raichur, Lucas Heublein, Tobias Feigl, Alexander Rügamer, Christopher Mutschler, Felix Ott
cs.CV updates on arXiv.org
Abstract: The primary objective of continual learning methods is to learn tasks sequentially from a stream of data while mitigating catastrophic forgetting. In this paper, we focus on learning an optimal representation between previous class prototypes and newly encountered ones. We propose a prototypical network with a Bayesian learning-driven contrastive loss (BLCL) tailored specifically for class-incremental learning scenarios. To this end, we introduce a contrastive loss that incorporates …
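The abstract is truncated before it specifies how the Bayesian weighting enters the loss, so the details of BLCL itself are not recoverable from this excerpt. As context only, below is a minimal, hypothetical sketch of the plain prototypical contrastive objective that such a loss typically builds on: features are pulled toward their own class prototype and pushed away from the prototypes of all other classes seen so far. The function name, temperature value, and tensor shapes are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a prototypical contrastive loss
# (NOT the paper's BLCL; the Bayesian weighting is omitted).
import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(embeddings, labels, prototypes, temperature=0.1):
    """embeddings: (B, D) feature vectors from the backbone.
    labels:     (B,) integer class ids indexing into `prototypes`.
    prototypes: (C, D) one prototype per class seen so far (old + new).
    """
    z = F.normalize(embeddings, dim=1)   # unit-norm features
    p = F.normalize(prototypes, dim=1)   # unit-norm prototypes
    logits = z @ p.t() / temperature     # (B, C) scaled cosine similarities
    # Cross-entropy against the ground-truth prototype is equivalent to a
    # contrastive objective in which the remaining prototypes act as negatives.
    return F.cross_entropy(logits, labels)

# Usage on random tensors, just to show the shapes involved:
B, D, C = 32, 128, 10
emb = torch.randn(B, D, requires_grad=True)
lab = torch.randint(0, C, (B,))
protos = torch.randn(C, D)
loss = prototypical_contrastive_loss(emb, lab, protos)
loss.backward()
```

In a class-incremental setting, keeping the prototypes of earlier classes in the similarity matrix lets them serve as fixed anchors, which is one common way such losses counteract forgetting; how BLCL weights these terms is exactly the part elided in the abstract above.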