
Deep Neural Networks in a Mathematical Framework




Deep Neural Networks in a Mathematical Framework Synopsis

This SpringerBrief describes how to build a rigorous end-to-end mathematical framework for deep neural networks. The authors provide tools to represent and describe neural networks, casting previous results in the field in a more natural light. In particular, the authors derive gradient descent algorithms in a unified way for several neural network structures, including multilayer perceptrons, convolutional neural networks, deep autoencoders and recurrent neural networks. Furthermore, the framework the authors develop is both more concise and more mathematically intuitive than previous representations of neural networks. This SpringerBrief is one step towards unlocking the black box of deep learning. The authors believe that this framework will help catalyze further discoveries regarding the mathematical properties of neural networks. This SpringerBrief is accessible not only to researchers, professionals and students working and studying in the field of deep learning, but also to those outside of the neural network community.

About This Edition

ISBN: 9783319753034
Publication date: 3rd April 2018
Authors: Anthony L. Caterini, Dong Eui Chang
Publisher: Springer International Publishing AG
Format: Paperback
Pagination: 84 pages
Series: SpringerBriefs in Computer Science
Genres: Neural networks and fuzzy systems; Pattern recognition