This book covers both classical and modern models in deep learning.
The primary focus is on the theory and algorithms of deep learning.
The theory and algorithms of neural networks are particularly important for
understanding the key design concepts behind neural architectures in
different applications. Why do neural networks work? When do they work
better than off-the-shelf machine-learning models? When is depth
useful? Why is training neural networks so hard? What are the
pitfalls? The book also discusses a wide range of applications in order to
give the practitioner a flavor of how neural architectures are designed for
different types of problems. Deep learning methods for various data domains,
such as text, images, and graphs, are presented in detail. The chapters of
this book span three categories:

1. The basics of neural networks: The backpropagation algorithm is discussed
in Chapter 2. Many traditional machine learning models can be understood as
special cases of neural networks. Chapter 3 explores the connections between
traditional machine learning and neural networks. Support vector machines,
linear/logistic regression, singular value decomposition, matrix
factorization, and recommender systems are shown to be special cases of
neural networks.

2. Fundamentals of neural networks: A detailed discussion of training and
regularization is provided in Chapters 4 and 5. Chapters 6 and 7 present
radial-basis function (RBF) networks and restricted Boltzmann machines.

3. Advanced topics in neural networks: Chapters 8, 9, and 10 discuss
recurrent neural networks, convolutional neural networks, and graph neural
networks. Several advanced topics, such as deep reinforcement learning,
attention mechanisms, transformer networks, Kohonen self-organizing maps,
and generative adversarial networks, are introduced in Chapters 11 and 12.

The textbook is written for
graduate students and upper-level undergraduate students. Researchers and
practitioners working in this field will also find it useful. Where
possible, an application-centric view is
highlighted in order to provide an understanding of the practical uses
of each class of techniques. The second edition is substantially
reorganized and expanded with separate chapters on backpropagation and
graph neural networks. Many chapters have been significantly revised
over the first edition. Greater focus is placed on modern deep
learning ideas such as attention mechanisms, transformers, and
pre-trained language models.
Product details
ISBN: 9783031296420
Published: 2023
Edition: 2nd edition
Publisher: Springer Nature
Language: English
Format: Digital book