"This is excellent material for readers who like mathematical approaches with a good dose of current market capabilities."
Krzysztof Kamyczek

"This book presents a deep dive into regularization that does justice to an under-appreciated technique, and presents it in the context of a larger discussion about optimizing machine and deep learning models."
Maureen Metzger

"This book tackles a well-known problem among the dedicated AI community that is not necessarily advertised by modern data platform suppliers."
Jesús Antonino Juárez Guerrero

Make your deep learning models more adaptable with these practical regularisation techniques.

For data scientists, machine learning engineers, and researchers with basic model development experience who want to improve their training efficiency and avoid overfitting errors.

Regularization in Deep Learning delivers practical techniques to help you build more general and adaptable deep learning models. It goes beyond basic techniques like data augmentation and explores strategies for architecture, objective function, and optimisation.

You will turn regularisation theory into practice using PyTorch, following guided implementations that you can easily adapt and customise to your own model's needs.

Key features include:

  • Insights into model generalisability
  • A holistic overview of regularisation techniques and strategies
  • Classical and modern views of generalisation, including bias and variance tradeoff
  • When and where to use different regularisation techniques
  • The background knowledge you need to understand cutting-edge research

Along the way, you will get just enough of the theory and mathematics behind regularisation to understand the new research emerging in this important area.

About the technology

Deep learning models that generate highly accurate results on their training data can struggle with messy real-world test datasets. Regularisation strategies help your models handle noisy data and changing requirements. By learning to tweak training data and loss functions, and to employ other regularisation approaches, you can ensure a model delivers excellent generalised performance and avoids overfitting errors.
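As an illustration of tweaking the loss function, here is a minimal NumPy sketch (not the book's PyTorch code) of one classic regularisation technique the book covers: adding an L2 weight penalty to a squared-error loss. The penalty strength `lam` and the toy data are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: gradient descent on MSE plus an L2 penalty (weight decay).
# `lam` controls how strongly large weights are penalised; lam=0 is the
# unregularised baseline. All names and values here are illustrative.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))                       # toy training inputs
true_w = np.array([1.0, -2.0, 0.0, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=50)    # noisy targets

def fit(lam, steps=2000, lr=0.01):
    """Minimise mean squared error + lam * ||w||^2 by gradient descent."""
    w = np.zeros(5)
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y) + 2 * lam * w
        w -= lr * grad
    return w

w_plain = fit(lam=0.0)   # unregularised fit
w_reg = fit(lam=0.1)     # L2-regularised fit: weights shrink toward zero

print(np.linalg.norm(w_reg) < np.linalg.norm(w_plain))  # smaller weight norm
```

The regularised solution trades a little training accuracy for smaller weights, which is exactly the kind of bias-variance tradeoff the book develops in depth.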


Regularization in Deep Learning delivers practical techniques to help you build more adaptable deep learning models. Along with exploring strategies for architecture, objective function, and optimisation, you will get just enough of the theory and mathematics behind regularisation to understand the new research emerging in this important area.

About the book

Regularization in Deep Learning teaches you how to improve your model performance with a toolbox of regularisation techniques. It covers both well-established regularisation methods and ground-breaking modern approaches. Each technique is introduced using graphics, illustrations, and step-by-step coding walkthroughs that make complex maths easy to follow. You will learn how to augment your dataset with random noise, improve your model's architecture, and apply regularisation in your optimisation procedures. You will soon be building focused deep learning models that avoid sprawling complexity and deliver more accurate results even with new or messy data sets.
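One of the techniques mentioned above, augmenting a dataset with random noise, can be sketched in a few lines. This is an illustrative NumPy example (not the book's code); the function name, noise scale, and number of copies are assumptions.

```python
import numpy as np

# Illustrative sketch: augment training inputs by appending copies perturbed
# with small Gaussian noise. Seeing many slightly different versions of each
# point discourages a model from memorising exact training examples.

rng = np.random.default_rng(42)

def augment_with_noise(X, copies=3, sigma=0.05):
    """Return the original inputs plus `copies` noisy versions of each row."""
    noisy = [X] + [X + rng.normal(scale=sigma, size=X.shape)
                   for _ in range(copies)]
    return np.concatenate(noisy, axis=0)

X_train = rng.normal(size=(100, 8))   # toy feature matrix
X_aug = augment_with_noise(X_train)

print(X_aug.shape)   # (400, 8): the 100 originals plus 3 noisy copies each
```

In practice the noise scale is a hyperparameter tuned like any other, and the same idea extends to label smoothing and noise injected inside the network itself.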

About the reader

For data scientists, machine learning engineers, and researchers with basic model development experience.


Product details

ISBN: 9781633439610
Published: 2023-10-02
Publisher: Manning Publications
Age level: P, 06
Language: English
Format: Paperback
Pages: 275

About the author

Peng Liu is an experienced data scientist focusing on applied research and the development of high-performance machine learning models in production. He holds a Ph.D. in Statistics from the National University of Singapore and teaches advanced analytics courses as an adjunct lecturer at universities. He specialises in the statistical aspects of deep learning.