Deep learning excels at extracting complex patterns but suffers from catastrophic forgetting when fine-tuned on new data. This book investigates how class- and domain-incremental learning affect neural networks for automated driving, identifying semantic shifts and feature changes as key factors behind forgetting. Tools for quantitatively measuring forgetting are selected and used to show how strategies such as image augmentation, pretraining, and architectural adaptations mitigate catastrophic forgetting.
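The book's own measurement tools are not reproduced in this listing; as a rough illustration of what "quantitatively measuring forgetting" means, the sketch below computes a commonly used average-forgetting score from an accuracy matrix. The function name and the mIoU-style values are hypothetical and are not taken from the book.

```python
# Minimal sketch (not the book's tooling): average forgetting computed from an
# accuracy matrix acc[i][j] = accuracy on task j after training on tasks 0..i.
def average_forgetting(acc):
    """Mean drop from each task's best earlier accuracy to its final accuracy."""
    num_tasks = len(acc)
    drops = []
    for j in range(num_tasks - 1):  # the last task has no later step to forget it
        best_earlier = max(acc[i][j] for i in range(j, num_tasks - 1))
        drops.append(best_earlier - acc[num_tasks - 1][j])
    return sum(drops) / len(drops)

# Hypothetical per-task mIoU values for a three-step class-incremental run.
acc = [
    [0.72, 0.00, 0.00],
    [0.55, 0.68, 0.00],
    [0.41, 0.60, 0.70],
]
print(average_forgetting(acc))  # ((0.72-0.41) + (0.68-0.60)) / 2 = 0.195
```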
DETAILS
Principles of Catastrophic Forgetting for Continual Semantic Segmentation in Automated Driving
Kalb, Tobias Michael
Paperback, 236 pages
With graphical illustrations
Language: English
210 mm
KIT Scientific Publishing (2024)
Weight: 450 g
ISBN-13: 978-3-7315-1373-5
Title no.: 97721333