Self-supervised learning improves robustness of deep learning lung tumor segmentation to CT imaging differences
May 15, 2024, 4:46 a.m. | Jue Jiang, Aneesh Rangnekar, Harini Veeraraghavan
cs.CV updates on arXiv.org arxiv.org
Abstract: Self-supervised learning (SSL) is an approach to extracting useful feature representations from unlabeled data, enabling fine-tuning on downstream tasks with limited labeled examples. Self-pretraining is an SSL approach that uses the curated task dataset both for pretraining the networks and for fine-tuning them. The availability of large, diverse, and uncurated public medical image sets provides the opportunity to apply SSL in the "wild" and potentially extract features robust to imaging variations. However, the benefit of wild- …
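The self-pretraining recipe the abstract describes, pretrain on the task's own unlabeled images, then fine-tune on a few labels, can be sketched on toy data. This is an illustrative assumption, not the paper's method: a linear autoencoder (computed via PCA/SVD) stands in for the SSL pretext task, and a logistic head stands in for the fine-tuned segmentation network. All data, shapes, and names here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a curated task dataset: 200 "images" of 16 pixels,
# generated from 4 hidden factors (hypothetical data, not CT scans).
L = rng.normal(size=(200, 4))             # hidden factors
M = rng.normal(size=(4, 16))              # mixing into 16 "pixels"
X = L @ M + 0.1 * rng.normal(size=(200, 16))
y = (L[:, 0] > 0).astype(float)           # labels derived from one factor

# --- Stage 1: self-pretraining on the *unlabeled* images ---
# The optimal linear autoencoder is PCA; its top components act as the
# pretrained encoder. (A real SSL method would use a deep network.)
X_c = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_c, full_matrices=False)
W_enc = Vt[:4].T                          # pretrained 16 -> 4 encoder

# --- Stage 2: fine-tune a small head on only 20 labeled examples ---
Z_small, y_small = (X_c @ W_enc)[:20], y[:20]
w, b = np.zeros(4), 0.0
for _ in range(2000):                     # logistic regression by GD
    p = 1 / (1 + np.exp(-(Z_small @ w + b)))
    g = p - y_small
    w -= 0.1 * Z_small.T @ g / 20
    b -= 0.1 * g.mean()

p_all = 1 / (1 + np.exp(-((X_c @ W_enc) @ w + b)))
acc = ((p_all > 0.5) == y).mean()
print(f"accuracy with pretrained features: {acc:.2f}")
```

The point of the sketch is the two-stage split: the encoder never sees labels during pretraining, and only the small head is fit on the 20 labeled examples, which is the scarce-annotation regime the abstract targets.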