Neural networks and deep learning : a textbook / Charu C. Aggarwal.
Material type: Text. Language: English. Publication details: Cham: Springer, 2023. Edition: 2nd ed. ISBN: 9783031296413.
| Item type | Home library | Collection | Call number | Status | Date due | Barcode |
|---|---|---|---|---|---|---|
| | OPJGU Sonepat - Campus Central Library | Course Reserve | 006.32 AG-N | Not For Loan | | 151309 |
| | OPJGU Sonepat - Campus Main Library | Textbooks | 006.32 AG-N | Available | | 151310 |
| | OPJGU Sonepat - Campus Main Library | Textbooks | 006.32 AG-N | Available | | 151311 |
"This book covers both classical and modern models in deep learning. The primary focus is on the theory and algorithms of deep learning, which are essential for understanding how neural architectures are designed for different applications. Why do neural networks work? When do they work better than off-the-shelf machine-learning models? When is depth useful? Why is training neural networks so hard? What are the pitfalls? The book also discusses a wide range of applications to give the practitioner a flavor of how neural architectures are designed for different types of problems. Deep learning methods for various data domains, such as text, images, and graphs, are presented in detail. The textbook is written for graduate students and upper-level undergraduate students; researchers and practitioners working in related fields will also find it useful. Where possible, an application-centric view is highlighted in order to provide an understanding of the practical uses of each class of techniques. The second edition is substantially reorganized and expanded, with separate chapters on backpropagation and graph neural networks. Many chapters have been significantly revised from the first edition. Greater focus is placed on modern deep learning ideas such as attention mechanisms, transformers, and pre-trained language models."