Deep Learning Toolbox™
User's Guide

Mark Hudson Beale
Martin T. Hagan
Howard B. Demuth

R2022b
How to Contact MathWorks

Latest news: www.mathworks.com

Sales and services: www.mathworks.com/sales_and_services

User community: www.mathworks.com/matlabcentral

Technical support: www.mathworks.com/support/contact_us

Phone: 508-647-7000

The MathWorks, Inc.
1 Apple Hill Drive
Natick, MA 01760-2098
Deep Learning Toolbox™ User's Guide
© COPYRIGHT 1992–2022 by The MathWorks, Inc.
The software described in this document is furnished under a license agreement. The software may be used or copied
only under the terms of the license agreement. No part of this manual may be photocopied or reproduced in any form
without prior written consent from The MathWorks, Inc.
FEDERAL ACQUISITION: This provision applies to all acquisitions of the Program and Documentation by, for, or through
the federal government of the United States. By accepting delivery of the Program or Documentation, the government
hereby agrees that this software or documentation qualifies as commercial computer software or commercial computer
software documentation as such terms are used or defined in FAR 12.212, DFARS Part 227.72, and DFARS 252.227-7014.
Accordingly, the terms and conditions of this Agreement and only those rights specified in this Agreement, shall pertain
to and govern the use, modification, reproduction, release, performance, display, and disclosure of the Program and
Documentation by the federal government (or other entity acquiring for or through the federal government) and shall
supersede any conflicting contractual terms or conditions. If this License fails to meet the government's needs or is
inconsistent in any respect with federal procurement law, the government agrees to return the Program and
Documentation, unused, to The MathWorks, Inc.
Trademarks
MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See
www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be
trademarks or registered trademarks of their respective holders.
Patents
MathWorks products are protected by one or more U.S. patents. Please see www.mathworks.com/patents for
more information.
Revision History
June 1992 First printing
April 1993 Second printing
January 1997 Third printing
July 1997 Fourth printing
January 1998 Fifth printing Revised for Version 3 (Release 11)
September 2000 Sixth printing Revised for Version 4 (Release 12)
June 2001 Seventh printing Minor revisions (Release 12.1)
July 2002 Online only Minor revisions (Release 13)
January 2003 Online only Minor revisions (Release 13SP1)
June 2004 Online only Revised for Version 4.0.3 (Release 14)
October 2004 Online only Revised for Version 4.0.4 (Release 14SP1)
October 2004 Eighth printing Revised for Version 4.0.4
March 2005 Online only Revised for Version 4.0.5 (Release 14SP2)
March 2006 Online only Revised for Version 5.0 (Release 2006a)
September 2006 Ninth printing Minor revisions (Release 2006b)
March 2007 Online only Minor revisions (Release 2007a)
September 2007 Online only Revised for Version 5.1 (Release 2007b)
March 2008 Online only Revised for Version 6.0 (Release 2008a)
October 2008 Online only Revised for Version 6.0.1 (Release 2008b)
March 2009 Online only Revised for Version 6.0.2 (Release 2009a)
September 2009 Online only Revised for Version 6.0.3 (Release 2009b)
March 2010 Online only Revised for Version 6.0.4 (Release 2010a)
September 2010 Online only Revised for Version 7.0 (Release 2010b)
April 2011 Online only Revised for Version 7.0.1 (Release 2011a)
September 2011 Online only Revised for Version 7.0.2 (Release 2011b)
March 2012 Online only Revised for Version 7.0.3 (Release 2012a)
September 2012 Online only Revised for Version 8.0 (Release 2012b)
March 2013 Online only Revised for Version 8.0.1 (Release 2013a)
September 2013 Online only Revised for Version 8.1 (Release 2013b)
March 2014 Online only Revised for Version 8.2 (Release 2014a)
October 2014 Online only Revised for Version 8.2.1 (Release 2014b)
March 2015 Online only Revised for Version 8.3 (Release 2015a)
September 2015 Online only Revised for Version 8.4 (Release 2015b)
March 2016 Online only Revised for Version 9.0 (Release 2016a)
September 2016 Online only Revised for Version 9.1 (Release 2016b)
March 2017 Online only Revised for Version 10.0 (Release 2017a)
September 2017 Online only Revised for Version 11.0 (Release 2017b)
March 2018 Online only Revised for Version 11.1 (Release 2018a)
September 2018 Online only Revised for Version 12.0 (Release 2018b)
March 2019 Online only Revised for Version 12.1 (Release 2019a)
September 2019 Online only Revised for Version 13 (Release 2019b)
March 2020 Online only Revised for Version 14 (Release 2020a)
September 2020 Online only Revised for Version 14.1 (Release 2020b)
March 2021 Online only Revised for Version 14.2 (Release 2021a)
September 2021 Online only Revised for Version 14.3 (Release 2021b)
March 2022 Online only Revised for Version 14.4 (Release 2022a)
September 2022 Online only Revised for Version 14.5 (Release 2022b)
Contents

Deep Networks
1
Deep Learning in MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-2
What Is Deep Learning? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-2
Start Deep Learning Faster Using Transfer Learning . . . . . . . . . . . . . . . . 1-2
Deep Learning Workflows . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-3
Deep Learning Apps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-5
Train Classifiers Using Features Extracted from Pretrained Networks . . . 1-7
Deep Learning with Big Data on CPUs, GPUs, in Parallel, and on the Cloud . . . . . . . . 1-7
Deep Learning Using Simulink . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-7
Deep Learning Interpretability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-8
Deep Learning Customization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-8
Deep Learning Import and Export . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-9

Pretrained Deep Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-11


Compare Pretrained Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-12
Load Pretrained Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-13
Visualize Pretrained Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-14
Feature Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-16
Transfer Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-17
Import and Export Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-17
Pretrained Networks for Audio Applications . . . . . . . . . . . . . . . . . . . . . . 1-18
Pretrained Models on GitHub . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-19

Learn About Convolutional Neural Networks . . . . . . . . . . . . . . . . . . . . . . 1-21

Example Deep Learning Networks Architectures . . . . . . . . . . . . . . . . . . . 1-23

Multiple-Input and Multiple-Output Networks . . . . . . . . . . . . . . . . . . . . . 1-41


Multiple-Input Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-41
Multiple-Output Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-41

List of Deep Learning Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-43


Deep Learning Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-43

Specify Layers of Convolutional Neural Network . . . . . . . . . . . . . . . . . . . 1-53


Image Input Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-54
Convolutional Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-54
Batch Normalization Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-58
ReLU Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-58
Cross Channel Normalization (Local Response Normalization) Layer . . . 1-59
Max and Average Pooling Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-59
Dropout Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-60
Fully Connected Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-60
Output Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-61

Set Up Parameters and Train Convolutional Neural Network . . . . . . . . . 1-64
Specify Solver and Maximum Number of Epochs . . . . . . . . . . . . . . . . . . 1-64
Specify and Modify Learning Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-64
Specify Validation Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-65
Select Hardware Resource . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-65
Save Checkpoint Networks and Resume Training . . . . . . . . . . . . . . . . . . 1-66
Set Up Parameters in Convolutional and Fully Connected Layers . . . . . . 1-66
Train Your Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-66

Train Network with Numeric Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-68

Train Network on Image and Feature Data . . . . . . . . . . . . . . . . . . . . . . . . 1-74

Compare Activation Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-81

Deep Learning Tips and Tricks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-87


Choose Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-87
Choose Training Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-89
Improve Training Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-90
Fix Errors in Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-91
Prepare and Preprocess Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-92
Use Available Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-94
Fix Errors With Loading from MAT-Files . . . . . . . . . . . . . . . . . . . . . . . . . 1-95

Long Short-Term Memory Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-97


LSTM Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-97
Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-100
Classification, Prediction, and Forecasting . . . . . . . . . . . . . . . . . . . . . . 1-101
Sequence Padding, Truncation, and Splitting . . . . . . . . . . . . . . . . . . . . 1-101
Normalize Sequence Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-105
Out-of-Memory Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-105
Visualization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-105
LSTM Layer Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-105

Deep Network Designer


2
Transfer Learning with Deep Network Designer . . . . . . . . . . . . . . . . . . . . . 2-2

Build Networks with Deep Network Designer . . . . . . . . . . . . . . . . . . . . . . 2-14


Transfer Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-15
Image Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-17
Sequence Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-19
Numeric Data Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-20
Convert Classification Network into Regression Network . . . . . . . . . . . . 2-22
Multiple-Input and Multiple-Output Networks . . . . . . . . . . . . . . . . . . . . . 2-22
Deep Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-24
Advanced Deep Learning Applications . . . . . . . . . . . . . . . . . . . . . . . . . . 2-25
dlnetwork for Custom Training Loops . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-27
Check Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-28

Train Networks Using Deep Network Designer . . . . . . . . . . . . . . . . . . . . . 2-30
Select Training Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-30
Train Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-32
Next Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-33

Import Custom Layer into Deep Network Designer . . . . . . . . . . . . . . . . . 2-34

Import Data into Deep Network Designer . . . . . . . . . . . . . . . . . . . . . . . . . 2-38


Import Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-38
Image Augmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-48
Validation Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-49

Create Simple Sequence Classification Network Using Deep Network Designer . . . . . . . . 2-51

Train Network for Time Series Forecasting Using Deep Network Designer . . . . . . . . 2-58

Generate MATLAB Code from Deep Network Designer . . . . . . . . . . . . . . 2-70


Generate MATLAB Code to Recreate Network Layers . . . . . . . . . . . . . . . 2-70
Generate MATLAB Code to Train Network . . . . . . . . . . . . . . . . . . . . . . . 2-70

Image-to-Image Regression in Deep Network Designer . . . . . . . . . . . . . 2-73

Generate Experiment Using Deep Network Designer . . . . . . . . . . . . . . . . 2-80

Transfer Learning with Pretrained Audio Networks in Deep Network Designer . . . . . . . . 2-87

Export Image Classification Network from Deep Network Designer to Simulink . . . . . . . . 2-96

Deep Learning with Images


3
Classify Webcam Images Using Deep Learning . . . . . . . . . . . . . . . . . . . . . 3-2

Train Deep Learning Network to Classify New Images . . . . . . . . . . . . . . . 3-6

Train Residual Network for Image Classification . . . . . . . . . . . . . . . . . . . 3-13

Classify Image Using GoogLeNet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-19

Extract Image Features Using Pretrained Network . . . . . . . . . . . . . . . . . 3-24

Transfer Learning Using Pretrained Network . . . . . . . . . . . . . . . . . . . . . . 3-29

Transfer Learning Using AlexNet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-36

Create Simple Deep Learning Network for Classification . . . . . . . . . . . . 3-43

Train Convolutional Neural Network for Regression . . . . . . . . . . . . . . . . 3-49

Train Network with Multiple Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-57

Convert Classification Network into Regression Network . . . . . . . . . . . . 3-66

Train Generative Adversarial Network (GAN) . . . . . . . . . . . . . . . . . . . . . . 3-72

Train Conditional Generative Adversarial Network (CGAN) . . . . . . . . . . 3-85

Train Wasserstein GAN with Gradient Penalty (WGAN-GP) . . . . . . . . . . . 3-98

Train Fast Style Transfer Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-110

Train a Siamese Network to Compare Images . . . . . . . . . . . . . . . . . . . . 3-124

Train a Siamese Network for Dimensionality Reduction . . . . . . . . . . . . 3-138

Train Neural ODE Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-151

Train Variational Autoencoder (VAE) to Generate Images . . . . . . . . . . . 3-162

Lane and Vehicle Detection in Simulink Using Deep Learning . . . . . . . 3-172

Classify ECG Signals in Simulink Using Deep Learning . . . . . . . . . . . . 3-178

Classify Images in Simulink Using GoogLeNet . . . . . . . . . . . . . . . . . . . . 3-182

Multilabel Image Classification Using Deep Learning . . . . . . . . . . . . . . 3-187

Acceleration for Simulink Deep Learning Models . . . . . . . . . . . . . . . . . 3-202


Run Acceleration Mode from the User Interface . . . . . . . . . . . . . . . . . . 3-202
Run Acceleration Mode Programmatically . . . . . . . . . . . . . . . . . . . . . . 3-203

Deep Learning with Time Series, Sequences, and Text


4
Sequence Classification Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . 4-3

Sequence Classification Using 1-D Convolutions . . . . . . . . . . . . . . . . . . . 4-10

Time Series Forecasting Using Deep Learning . . . . . . . . . . . . . . . . . . . . . 4-16

Train Speech Command Recognition Model Using Deep Learning . . . . . 4-27

Sequence-to-Sequence Classification Using Deep Learning . . . . . . . . . . 4-39

Sequence-to-Sequence Regression Using Deep Learning . . . . . . . . . . . . 4-44

Sequence-to-One Regression Using Deep Learning . . . . . . . . . . . . . . . . 4-53

Train Network with Complex-Valued Data . . . . . . . . . . . . . . . . . . . . . . . . . 4-60

Train Network with LSTM Projected Layer . . . . . . . . . . . . . . . . . . . . . . . . 4-68

Predict Battery State of Charge Using Deep Learning . . . . . . . . . . . . . . . 4-77

Classify Videos Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-86

Classify Videos Using Deep Learning with Custom Training Loop . . . . . 4-96

Train Sequence Classification Network Using Data With Imbalanced Classes . . . . . . . . 4-111

Sequence-to-Sequence Classification Using 1-D Convolutions . . . . . . . 4-121

Time Series Anomaly Detection Using Deep Learning . . . . . . . . . . . . . . 4-131

Sequence Classification Using CNN-LSTM Network . . . . . . . . . . . . . . . 4-142

Train Latent ODE Network with Irregularly Sampled Time-Series Data . . . . . . . . 4-155

Multivariate Time Series Anomaly Detection Using Graph Neural Network . . . . . . . . 4-175

Classify Text Data Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . 4-193

Classify Text Data Using Convolutional Neural Network . . . . . . . . . . . . 4-201

Multilabel Text Classification Using Deep Learning . . . . . . . . . . . . . . . . 4-208

Classify Text Data Using Custom Training Loop . . . . . . . . . . . . . . . . . . . 4-227

Generate Text Using Autoencoders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-238

Define Text Encoder Model Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-250

Define Text Decoder Model Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-257

Sequence-to-Sequence Translation Using Attention . . . . . . . . . . . . . . . 4-264

Generate Text Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-278

Pride and Prejudice and MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-284

Word-By-Word Text Generation Using Deep Learning . . . . . . . . . . . . . . 4-290

Image Captioning Using Attention . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-297

Language Translation Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . 4-321

Predict and Update Network State in Simulink . . . . . . . . . . . . . . . . . . . 4-343

Classify and Update Network State in Simulink . . . . . . . . . . . . . . . . . . . 4-347

Time Series Prediction in Simulink Using Deep Learning Network . . . 4-351

Battery State of Charge Estimation in Simulink Using Deep Learning Network . . . . . . . . 4-356

Improve Performance of Deep Learning Simulations in Simulink . . . . 4-359

Physical System Modeling Using LSTM Network in Simulink . . . . . . . . 4-363

Deep Learning Tuning and Visualization


5
Explore Network Predictions Using Deep Learning Visualization Techniques . . . . . . . . 5-2

Deep Dream Images Using GoogLeNet . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-15

Grad-CAM Reveals the Why Behind Deep Learning Decisions . . . . . . . . 5-21

Interpret Deep Learning Time-Series Classifications Using Grad-CAM . . . . . . . . 5-24

Understand Network Predictions Using Occlusion . . . . . . . . . . . . . . . . . . 5-38

Investigate Classification Decisions Using Gradient Attribution Techniques . . . . . . . . 5-45

Understand Network Predictions Using LIME . . . . . . . . . . . . . . . . . . . . . 5-56

Investigate Spectrogram Classifications Using LIME . . . . . . . . . . . . . . . 5-63

Interpret Deep Network Predictions on Tabular Data Using LIME . . . . . 5-73

Explore Semantic Segmentation Network Using Grad-CAM . . . . . . . . . . 5-80

Investigate Audio Classifications Using Deep Learning Interpretability Techniques . . . . . . . . 5-87

Generate Untargeted and Targeted Adversarial Examples for Image Classification . . . . . . . . 5-101

Train Image Classification Network Robust to Adversarial Examples . 5-108

Generate Adversarial Examples for Semantic Segmentation . . . . . . . . 5-120

Resume Training from Checkpoint Network . . . . . . . . . . . . . . . . . . . . . . 5-131

Deep Learning Using Bayesian Optimization . . . . . . . . . . . . . . . . . . . . . 5-136

Train Deep Learning Networks in Parallel . . . . . . . . . . . . . . . . . . . . . . . 5-146

Monitor Deep Learning Training Progress . . . . . . . . . . . . . . . . . . . . . . . 5-151

Customize Output During Deep Learning Network Training . . . . . . . . . 5-155

Detect Vanishing Gradients in Deep Neural Networks by Plotting Gradient Distributions . . . . . . . . 5-159

Investigate Network Predictions Using Class Activation Mapping . . . . 5-170

View Network Behavior Using tsne . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-176

Visualize Activations of a Convolutional Neural Network . . . . . . . . . . . 5-188

Visualize Activations of LSTM Network . . . . . . . . . . . . . . . . . . . . . . . . . . 5-199

Visualize Features of a Convolutional Neural Network . . . . . . . . . . . . . 5-203

Visualize Image Classifications Using Maximal and Minimal Activating Images . . . . . . . . 5-210

Monitor GAN Training Progress and Identify Common Failure Modes . . . . . . . . 5-229
Convergence Failure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-229
Mode Collapse . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-231

Deep Learning Visualization Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-233


Visualization Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-233
Interpretability Methods for Nonimage Data . . . . . . . . . . . . . . . . . . . . . 5-238

ROC Curve and Performance Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-240


Introduction to ROC Curve . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-240
Performance Curve with MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-241
ROC Curve for Multiclass Classification . . . . . . . . . . . . . . . . . . . . . . . . 5-241
Performance Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-243
Classification Scores and Thresholds . . . . . . . . . . . . . . . . . . . . . . . . . . 5-245
Pointwise Confidence Intervals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-249

Compare Deep Learning Models Using ROC Curves . . . . . . . . . . . . . . . 5-251

Manage Deep Learning Experiments


6
Create a Deep Learning Experiment for Classification . . . . . . . . . . . . . . . 6-2

Create a Deep Learning Experiment for Regression . . . . . . . . . . . . . . . . 6-10

Use Experiment Manager to Train Networks in Parallel . . . . . . . . . . . . . 6-18


Set Up Parallel Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-19

Evaluate Deep Learning Experiments by Using Metric Functions . . . . . 6-21

Tune Experiment Hyperparameters by Using Bayesian Optimization . . 6-29

Try Multiple Pretrained Networks for Transfer Learning . . . . . . . . . . . . 6-40

Experiment with Weight Initializers for Transfer Learning . . . . . . . . . . . 6-48

Choose Training Configurations for LSTM Using Bayesian Optimization . . . . . . . . 6-55

Run a Custom Training Experiment for Image Comparison . . . . . . . . . . 6-68

Use Experiment Manager to Train Generative Adversarial Networks (GANs) . . . . . . . . 6-83

Use Bayesian Optimization in Custom Training Experiments . . . . . . . . . 6-98

Custom Training with Multiple GPUs in Experiment Manager . . . . . . . 6-111

Offload Experiments as Batch Jobs to Cluster . . . . . . . . . . . . . . . . . . . . 6-124


Create Batch Job on Cluster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-124
Track Progress of Batch Job . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-125
Interrupt Training in Batch Job . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-125
Retrieve Results and Clean Up Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-126

Keyboard Shortcuts for Experiment Manager . . . . . . . . . . . . . . . . . . . . 6-127


Shortcuts for General Navigation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-127
Shortcuts for Experiment Browser . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-127
Shortcuts for Results Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-128

Debug Experiments for Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . 6-129


Debug Built-In Training Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . 6-129
Debug Custom Training Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . 6-131

Deep Learning in Parallel and the Cloud


7
Scale Up Deep Learning in Parallel, on GPUs, and in the Cloud . . . . . . . . 7-2
Train Single Network in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-3
Train Multiple Networks in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-6
Batch Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-7
Manage Cluster Profiles and Automatic Pool Creation . . . . . . . . . . . . . . . . 7-8
Deep Learning Precision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-8

Deep Learning in the Cloud . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-10


Access MATLAB in the Cloud . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-10
Work with Big Data in the Cloud . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-12

Deep Learning with MATLAB on Multiple GPUs . . . . . . . . . . . . . . . . . . . . 7-14


Use Multiple GPUs in Local Machine . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-14
Use Multiple GPUs in Cluster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-15
Optimize Mini-Batch Size and Learning Rate . . . . . . . . . . . . . . . . . . . . . 7-15
Select Particular GPUs to Use for Training . . . . . . . . . . . . . . . . . . . . . . . 7-15
Train Multiple Networks on Multiple GPUs . . . . . . . . . . . . . . . . . . . . . . . 7-16
Advanced Support for Fast Multi-Node GPU Communication . . . . . . . . . . 7-17

Deep Learning with Big Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-18
Work with Big Data in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-18
Preprocess Data in Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-18
Work with Big Data in the Cloud . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-19
Preprocess Data for Custom Training Loops . . . . . . . . . . . . . . . . . . . . . . 7-19

Run Custom Training Loops on a GPU and in Parallel . . . . . . . . . . . . . . . 7-21


Train Network on GPU . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-21
Train Single Network in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-22
Train Multiple Networks in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-26
Use Experiment Manager to Train in Parallel . . . . . . . . . . . . . . . . . . . . . 7-28

Cloud AI Workflow Using the Deep Learning Container . . . . . . . . . . . . . 7-29

Train Network in the Cloud Using Automatic Parallel Support . . . . . . . . 7-30

Use parfeval to Train Multiple Deep Learning Networks . . . . . . . . . . . . . 7-34

Send Deep Learning Batch Job to Cluster . . . . . . . . . . . . . . . . . . . . . . . . . 7-41

Train Network Using Automatic Multi-GPU Support . . . . . . . . . . . . . . . . 7-44

Use parfor to Train Multiple Deep Learning Networks . . . . . . . . . . . . . . 7-48

Upload Deep Learning Data to the Cloud . . . . . . . . . . . . . . . . . . . . . . . . . 7-55

Train Network in Parallel with Custom Training Loop . . . . . . . . . . . . . . . 7-57

Train Network Using Federated Learning . . . . . . . . . . . . . . . . . . . . . . . . . 7-66

Train Network on Amazon Web Services Using MATLAB Deep Learning Container . . . . . . . . 7-75

Use Amazon S3 Buckets with MATLAB Deep Learning Container . . . . . 7-79

Use Experiment Manager in the Cloud with MATLAB Deep Learning Container . . . . . . . . 7-82

Computer Vision Examples


8
Gesture Recognition using Videos and Deep Learning . . . . . . . . . . . . . . . 8-2

Code Generation for Object Detection by Using Single Shot Multibox Detector . . . . . . . . 8-23

Point Cloud Classification Using PointNet Deep Learning . . . . . . . . . . . 8-26

Activity Recognition from Video and Optical Flow Data Using Deep Learning . . . . . . . . 8-49

Import Pretrained ONNX YOLO v2 Object Detector . . . . . . . . . . . . . . . . . 8-77

Export YOLO v2 Object Detector to ONNX . . . . . . . . . . . . . . . . . . . . . . . . 8-84

Object Detection Using SSD Deep Learning . . . . . . . . . . . . . . . . . . . . . . . 8-90

Object Detection Using YOLO v3 Deep Learning . . . . . . . . . . . . . . . . . . 8-102

Object Detection Using YOLO v4 Deep Learning . . . . . . . . . . . . . . . . . . 8-117

Object Detection Using YOLO v2 Deep Learning . . . . . . . . . . . . . . . . . . 8-126

Semantic Segmentation Using Deep Learning . . . . . . . . . . . . . . . . . . . . 8-137

Semantic Segmentation Using Dilated Convolutions . . . . . . . . . . . . . . . 8-156

Train Simple Semantic Segmentation Network in Deep Network Designer . . . . . . . . 8-161

Semantic Segmentation of Multispectral Images Using Deep Learning . . . . . . . . 8-167

3-D Brain Tumor Segmentation Using Deep Learning . . . . . . . . . . . . . . 8-181

Define Custom Pixel Classification Layer with Tversky Loss . . . . . . . . . 8-191

Train Object Detector Using R-CNN Deep Learning . . . . . . . . . . . . . . . . 8-198

Object Detection Using Faster R-CNN Deep Learning . . . . . . . . . . . . . . 8-211

Perform Instance Segmentation Using Mask R-CNN . . . . . . . . . . . . . . . 8-221

Estimate Body Pose Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . 8-226

Generate Image from Segmentation Map Using Deep Learning . . . . . . 8-234

Image Processing Examples


9
Remove Noise from Color Image Using Pretrained Neural Network . . . . 9-2

Increase Image Resolution Using Deep Learning . . . . . . . . . . . . . . . . . . . . 9-8

JPEG Image Deblocking Using Deep Learning . . . . . . . . . . . . . . . . . . . . . 9-24

Image Processing Operator Approximation Using Deep Learning . . . . . 9-37

Develop Camera Processing Pipeline Using Deep Learning . . . . . . . . . . 9-52

Brighten Extremely Dark Images Using Deep Learning . . . . . . . . . . . . . 9-74

Classify Tumors in Multiresolution Blocked Images . . . . . . . . . . . . . . . . 9-85

Unsupervised Day-to-Dusk Image Translation Using UNIT . . . . . . . . . . . 9-97

Quantify Image Quality Using Neural Image Assessment . . . . . . . . . . . 9-108

Neural Style Transfer Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . 9-121

Unsupervised Medical Image Denoising Using CycleGAN . . . . . . . . . . . 9-130

Unsupervised Medical Image Denoising Using UNIT . . . . . . . . . . . . . . . 9-143

Detect Image Anomalies Using Explainable One-Class Classification Neural Network . . . . . . . . 9-156

Classify Defects on Wafer Maps Using Deep Learning . . . . . . . . . . . . . . 9-170

Detect Image Anomalies Using Pretrained ResNet-18 Feature Embeddings . . . . . . . . 9-186

Segment Lungs from CT Scan Using Pretrained Neural Network . . . . . 9-206

Brain MRI Segmentation Using Pretrained 3-D U-Net Network . . . . . . 9-215

Breast Tumor Segmentation from Ultrasound Using Deep Learning . . 9-223

Automated Driving Examples


10
Train a Deep Learning Vehicle Detector . . . . . . . . . . . . . . . . . . . . . . . . . . 10-2

Create Occupancy Grid Using Monocular Camera and Semantic Segmentation . . . . . . . . 10-12

Train Deep Learning Semantic Segmentation Network Using 3-D Simulation Data . . . . . . . . 10-26

Lidar Examples
11
Code Generation for Lidar Object Detection Using SqueezeSegV2 Network . . . . . . . . 11-2

Lidar Object Detection Using Complex-YOLO v4 Network . . . . . . . . . . . . 11-8

Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning . . . . . . . . 11-25

Code Generation For Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning . . . . . . . . 11-35

Lidar Point Cloud Semantic Segmentation Using PointSeg Deep Learning Network . . . . . . . . 11-41

Lidar Point Cloud Semantic Segmentation Using SqueezeSegV2 Deep Learning Network . . . . . . . . 11-52

Code Generation for Lidar Point Cloud Segmentation Network . . . . . . 11-61

Lidar 3-D Object Detection Using PointPillars Deep Learning . . . . . . . 11-68

Signal Processing Examples


12
Learn Pre-Emphasis Filter Using Deep Learning . . . . . . . . . . . . . . . . . . . 12-2

Hand Gesture Classification Using Radar Signals and Deep Learning . . . . . . . . 12-12

Waveform Segmentation Using Deep Learning . . . . . . . . . . . . . . . . . . . 12-24

Classify ECG Signals Using Long Short-Term Memory Networks . . . . . 12-44

Generate Synthetic Signals Using Conditional GAN . . . . . . . . . . . . . . . 12-62

Classify Time Series Using Wavelet Analysis and Deep Learning . . . . . 12-77

Deploy Signal Classifier on NVIDIA Jetson Using Wavelet Analysis and Deep Learning . . . . . . . . 12-94

Deploy Signal Classifier Using Wavelets and Deep Learning on Raspberry Pi . . . . . . . . 12-110

Deploy Signal Segmentation Deep Network on Raspberry Pi . . . . . . . 12-117

Anomaly Detection Using Autoencoder and Wavelets . . . . . . . . . . . . . 12-127

Fault Detection Using Wavelet Scattering and Recurrent Deep Networks . . . . . . . . 12-138

Parasite Classification Using Wavelet Scattering and Deep Learning . . . . . . . . 12-146

Detect Anomalies Using Wavelet Scattering with Autoencoders . . . . . 12-160

Denoise Signals with Adversarial Learning Denoiser Model . . . . . . . . 12-177

Human Health Monitoring Using Continuous Wave Radar and Deep Learning . . . . . . . . 12-191

Wireless Comm Examples
13
Train DQN Agent for Beam Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-2

CSI Feedback with Autoencoders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-9

Modulation Classification by Using FPGA . . . . . . . . . . . . . . . . . . . . . . . . 13-36

Neural Network for Digital Predistortion Design - Offline Training . . . 13-48

Neural Network for Beam Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-62

Spectrum Sensing with Deep Learning to Identify 5G and LTE Signals . . . . . . . . 13-85

Autoencoders for Wireless Communications . . . . . . . . . . . . . . . . . . . . 13-100

Modulation Classification with Deep Learning . . . . . . . . . . . . . . . . . . . 13-116

Training and Testing a Neural Network for LLR Estimation . . . . . . . . 13-130

Design a Deep Neural Network with Simulated Data to Detect WLAN Router Impersonation . . . . . . . . 13-141

Test a Deep Neural Network with Captured Data to Detect WLAN Router Impersonation . . . . . . . . 13-156

Audio Examples
14
Transfer Learning with Pretrained Audio Networks . . . . . . . . . . . . . . . . . 14-2

Speech Command Recognition in Simulink . . . . . . . . . . . . . . . . . . . . . . . 14-5

Speaker Identification Using Custom SincNet Layer and Deep Learning . . . . . . . . 14-8

Dereverberate Speech Using Deep Learning Networks . . . . . . . . . . . . . 14-22

Speaker Recognition Using x-vectors . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-46

Speaker Diarization Using x-vectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-61

Train Spoken Digit Recognition Network Using Out-of-Memory Audio Data . . . . . . . . 14-75

Train Spoken Digit Recognition Network Using Out-of-Memory Features . . . . . . . . 14-83

Keyword Spotting in Noise Code Generation with Intel MKL-DNN . . . 14-90

Keyword Spotting in Noise Code Generation on Raspberry Pi . . . . . . . 14-96

Speech Command Recognition Code Generation on Raspberry Pi . . . 14-104

Speech Command Recognition Code Generation with Intel MKL-DNN . . . . . . . . 14-114

Train Generative Adversarial Network (GAN) for Sound Synthesis . . 14-122

Sequential Feature Selection for Audio Features . . . . . . . . . . . . . . . . . 14-144

Acoustic Scene Recognition Using Late Fusion . . . . . . . . . . . . . . . . . . 14-157

Keyword Spotting in Noise Using MFCC and LSTM Networks . . . . . . 14-173

Speech Emotion Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-199

Spoken Digit Recognition with Wavelet Scattering and Deep Learning . . . . . . . . 14-212

Cocktail Party Source Separation Using Deep Learning Networks . . . 14-228

Voice Activity Detection in Noise Using Deep Learning . . . . . . . . . . . . 14-251

Denoise Speech Using Deep Learning Networks . . . . . . . . . . . . . . . . . 14-275

Accelerate Audio Deep Learning Using GPU-Based Feature Extraction . . . . . . . . 14-296

Acoustics-Based Machine Fault Recognition . . . . . . . . . . . . . . . . . . . . 14-307

Acoustics-Based Machine Fault Recognition Code Generation with Intel MKL-DNN . . . . . . . . 14-327

Acoustics-Based Machine Fault Recognition Code Generation on Raspberry Pi . . . . . . . . 14-334

End-to-End Deep Speech Separation . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-345

Train 3-D Sound Event Localization and Detection (SELD) Using Deep Learning . . . . . . . . 14-361

3-D Sound Event Localization and Detection Using Trained Recurrent Convolutional Neural Network . . . . . . . . 14-388

Speech Command Recognition Code Generation with Intel MKL-DNN Using Simulink . . . . . . . . 14-402

Speech Command Recognition on Raspberry Pi Using Simulink . . . . 14-412

Audio-Based Anomaly Detection for Machine Health Monitoring . . . 14-419

3-D Speech Enhancement Using Trained Filter and Sum Network . . . 14-432

Train 3-D Speech Enhancement Network Using Deep Learning . . . . . 14-442

Audio Transfer Learning Using Experiment Manager . . . . . . . . . . . . . 14-465

Reinforcement Learning Examples


15
Reinforcement Learning Using Deep Neural Networks . . . . . . . . . . . . . . 15-2
Reinforcement Learning Workflow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-3
Reinforcement Learning Environments . . . . . . . . . . . . . . . . . . . . . . . . . . 15-3
Reinforcement Learning Agents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-4
Create Deep Neural Network Policies and Value Functions . . . . . . . . . . . 15-5
Train Reinforcement Learning Agents . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-6
Deploy Trained Policies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-6

Create Simulink Environment and Train Agent . . . . . . . . . . . . . . . . . . . . 15-8

Train DDPG Agent to Swing Up and Balance Pendulum with Image Observation . . . . . . . . 15-17

Create Agent Using Deep Network Designer and Train Using Image Observations . . . . . . . . 15-25

Imitate MPC Controller for Lane Keeping Assist . . . . . . . . . . . . . . . . . . 15-38

Train DDPG Agent to Control Flying Robot . . . . . . . . . . . . . . . . . . . . . . . 15-46

Train Biped Robot to Walk Using Reinforcement Learning Agents . . . 15-52

Train Humanoid Walker . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15-64

Train DDPG Agent for Adaptive Cruise Control . . . . . . . . . . . . . . . . . . . 15-71

Train DQN Agent for Lane Keeping Assist Using Parallel Computing . 15-80

Train DDPG Agent for Path-Following Control . . . . . . . . . . . . . . . . . . . . 15-88

Train PPO Agent for Automatic Parking Valet . . . . . . . . . . . . . . . . . . . . 15-96

Predictive Maintenance Examples


16
Chemical Process Fault Detection Using Deep Learning . . . . . . . . . . . . . 16-2

Rolling Element Bearing Fault Diagnosis Using Deep Learning . . . . . . 16-12

Remaining Useful Life Estimation Using Convolutional Neural Network . . . . . . . . 16-23

Anomaly Detection in Industrial Machinery Using Three-Axis Vibration Data . . . . . . . . 16-34

Battery Cycle Life Prediction Using Deep Learning . . . . . . . . . . . . . . . 16-49

Computational Finance Examples


17
Compare Deep Learning Networks for Credit Default Prediction . . . . . . 17-2

Interpret and Stress-Test Deep Learning Networks for Probability of Default . . . . . . . . 17-15

Hedge Options Using Reinforcement Learning Toolbox™ . . . . . . . . . . . 17-33

Use Deep Learning to Approximate Barrier Option Prices with Heston Model . . . . . . . . 17-42

Backtest Strategies Using Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . 17-51

Import, Export, and Customization


18
Train Deep Learning Model in MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . 18-3
Training Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-3
Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-5

Define Custom Deep Learning Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-9


Layer Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-10
Intermediate Layer Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-13
Output Layer Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-14
Check Validity of Custom Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-15

Define Custom Deep Learning Intermediate Layers . . . . . . . . . . . . . . . . 18-16


Intermediate Layer Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-16
Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-17
Formatted Inputs and Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-19
Custom Layer Acceleration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-19
Intermediate Layer Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-20
Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-23
Reset State Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-26
Backward Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-26
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-29
Check Validity of Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-29

Define Custom Deep Learning Output Layers . . . . . . . . . . . . . . . . . . . . . 18-31
Output Layer Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-31
Output Layer Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-31
Custom Layer Acceleration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-33
Output Layer Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-34
Forward Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-35
Backward Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-35
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-37
Check Validity of Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-37

Define Custom Deep Learning Layer with Learnable Parameters . . . . . 18-38


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-39
Name Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . . . . . 18-41
Declare Properties and Learnable Parameters . . . . . . . . . . . . . . . . . . . 18-42
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-44
Create Initialize Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-45
Create Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-45
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-48
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-49
Check Validity of Custom Layer Using checkLayer . . . . . . . . . . . . . . . . 18-50
Include Custom Layer in Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-50

Define Custom Deep Learning Layer with Multiple Inputs . . . . . . . . . . 18-53


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-53
Name Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . . . . . 18-56
Declare Properties and Learnable Parameters . . . . . . . . . . . . . . . . . . . 18-56
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-58
Create Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-59
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-62
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-63
Check Validity of Layer with Multiple Inputs . . . . . . . . . . . . . . . . . . . . . 18-64
Use Custom Weighted Addition Layer in Network . . . . . . . . . . . . . . . . . 18-64

Define Custom Deep Learning Layer with Formatted Inputs . . . . . . . . . 18-67


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-68
Name Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . . . . . 18-70
Declare Properties and Learnable Parameters . . . . . . . . . . . . . . . . . . . 18-71
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-72
Create Initialize Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-74
Create Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-75
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-78
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-80
Include Custom Layer in Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-80

Define Custom Recurrent Deep Learning Layer . . . . . . . . . . . . . . . . . . . 18-83


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-84
Name Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-86
Declare Properties, State, and Learnable Parameters . . . . . . . . . . . . . . 18-86
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-88
Create Initialize Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-90
Create Predict Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-91
Create Reset State Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-93
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-94
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-96
Include Custom Layer in Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-97

Define Custom Classification Output Layer . . . . . . . . . . . . . . . . . . . . . . . 18-99
Classification Output Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . 18-99
Name the Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . 18-100
Declare Layer Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-101
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-101
Create Forward Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-102
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-103
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-104
Check Output Layer Validity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-104
Include Custom Classification Output Layer in Network . . . . . . . . . . . 18-104

Define Custom Regression Output Layer . . . . . . . . . . . . . . . . . . . . . . . . 18-107


Regression Output Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-107
Name the Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . 18-108
Declare Layer Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-108
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-109
Create Forward Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-110
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-111
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-112
Check Output Layer Validity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-112
Include Custom Regression Output Layer in Network . . . . . . . . . . . . . 18-113

Specify Custom Layer Backward Function . . . . . . . . . . . . . . . . . . . . . . 18-115


Create Custom Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-115
Create Backward Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-117
Complete Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-119
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-120

Specify Custom Output Layer Backward Loss Function . . . . . . . . . . . . 18-122


Create Custom Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-122
Create Backward Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-123
Complete Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-123
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-124

Custom Layer Function Acceleration . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-126


Acceleration Considerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-126

Deep Learning Network Composition . . . . . . . . . . . . . . . . . . . . . . . . . . 18-129


Automatically Initialize Learnable dlnetwork Objects for Training . . . . 18-129
Predict and Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-130
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-131

Define Nested Deep Learning Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-132


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-133
Name Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . . . . 18-135
Declare Properties and Learnable Parameters . . . . . . . . . . . . . . . . . . 18-136
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-138
Create Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-141
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-143
GPU Compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-145

Train Deep Learning Network with Nested Layers . . . . . . . . . . . . . . . . 18-147

Define Custom Deep Learning Layer for Code Generation . . . . . . . . . 18-154


Intermediate Layer Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-155

Name Layer and Specify Superclasses . . . . . . . . . . . . . . . . . . . . . . . . 18-157
Specify Code Generation Pragma . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-158
Declare Properties and Learnable Parameters . . . . . . . . . . . . . . . . . . 18-158
Create Constructor Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-160
Create Forward Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-161
Completed Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-163
Check Custom Layer for Code Generation Compatibility . . . . . . . . . . . 18-164

Check Custom Layer Validity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-166


Check Custom Layer Validity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-166
List of Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-167
Generated Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-169
Diagnostics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-170

Specify Custom Weight Initialization Function . . . . . . . . . . . . . . . . . . 18-187

Compare Layer Weight Initializers . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-193

Assemble Network from Pretrained Keras Layers . . . . . . . . . . . . . . . . 18-199

Replace Unsupported Keras Layer with Function Layer . . . . . . . . . . . 18-204

Assemble Multiple-Output Network for Prediction . . . . . . . . . . . . . . . 18-208

Automatic Differentiation Background . . . . . . . . . . . . . . . . . . . . . . . . . 18-212


What Is Automatic Differentiation? . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-212
Forward Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-212
Reverse Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-214

Use Automatic Differentiation In Deep Learning Toolbox . . . . . . . . . . 18-217


Custom Training and Calculations Using Automatic Differentiation . . . 18-217
Use dlgradient and dlfeval Together for Automatic Differentiation . . . 18-218
Derivative Trace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-218
Characteristics of Automatic Derivatives . . . . . . . . . . . . . . . . . . . . . . 18-219

Define Custom Training Loops, Loss Functions, and Networks . . . . . . 18-221


Define Deep Learning Network for Custom Training Loops . . . . . . . . . 18-221
Specify Loss Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-225
Update Learnable Parameters Using Automatic Differentiation . . . . . . 18-226

Specify Training Options in Custom Training Loop . . . . . . . . . . . . . . . 18-228


Solver Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-229
Learn Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-229
Plots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-230
Verbose Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-231
Mini-Batch Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-232
Number of Epochs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-232
Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-232
L2 Regularization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-234
Gradient Clipping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-234
Single CPU or GPU Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-235
Checkpoints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-235

Train Network Using Custom Training Loop . . . . . . . . . . . . . . . . . . . . . 18-237

Define Model Loss Function for Custom Training Loop . . . . . . . . . . . . 18-245
Create Model Loss Function for Model Defined as dlnetwork Object . . 18-245
Create Model Loss Function for Model Defined as Function . . . . . . . . 18-245
Evaluate Model Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-246
Update Learnable Parameters Using Gradients . . . . . . . . . . . . . . . . . . 18-246
Use Model Loss Function in Custom Training Loop . . . . . . . . . . . . . . . 18-247
Debug Model Loss Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-247

Update Batch Normalization Statistics in Custom Training Loop . . . 18-250

Train Robust Deep Learning Network with Jacobian Regularization 18-256

Make Predictions Using dlnetwork Object . . . . . . . . . . . . . . . . . . . . . . 18-269

Train Network Using Model Function . . . . . . . . . . . . . . . . . . . . . . . . . . 18-273

Update Batch Normalization Statistics Using Model Function . . . . . . 18-287

Make Predictions Using Model Function . . . . . . . . . . . . . . . . . . . . . . . 18-301

Initialize Learnable Parameters for Model Function . . . . . . . . . . . . . . 18-307


Default Layer Initializations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-307
Learnable Parameter Sizes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-308
Glorot Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-311
He Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-313
Gaussian Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-314
Uniform Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-315
Orthogonal Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-315
Unit Forget Gate Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-316
Ones Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-316
Zeros Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-317
Storing Learnable Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-317

Deep Learning Function Acceleration for Custom Training Loops . . . 18-319


Accelerate Deep Learning Function Directly . . . . . . . . . . . . . . . . . . . . 18-320
Accelerate Parts of Deep Learning Function . . . . . . . . . . . . . . . . . . . . 18-320
Reusing Caches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-321
Storing and Clearing Caches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-322
Acceleration Considerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-322

Accelerate Custom Training Loop Functions . . . . . . . . . . . . . . . . . . . . 18-327

Evaluate Performance of Accelerated Deep Learning Function . . . . . 18-339

Check Accelerated Deep Learning Function Outputs . . . . . . . . . . . . . 18-354

Solve Partial Differential Equations Using Deep Learning . . . . . . . . . 18-357

Solve Partial Differential Equation with LBFGS Method and Deep Learning . . . . . . . . 18-367

Solve Ordinary Differential Equation Using Neural Network . . . . . . . 18-376

Dynamical System Modeling Using Neural ODE . . . . . . . . . . . . . . . . . . 18-384

Node Classification Using Graph Convolutional Network . . . . . . . . . . 18-393

Multilabel Graph Classification Using Graph Attention Networks . . . 18-408

Train Network Using Cyclical Learning Rate for Snapshot Ensembling . . . . . . . . 18-433

Interoperability Between Deep Learning Toolbox, TensorFlow, PyTorch, and ONNX . . . . . . . . 18-444
Support Packages for Interoperability . . . . . . . . . . . . . . . . . . . . . . . . . 18-444
Functions that Import Deep Learning Networks . . . . . . . . . . . . . . . . . 18-445
Visualize Imported Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-446
Predict with Imported Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-448
Transfer Learning with Imported Network . . . . . . . . . . . . . . . . . . . . . 18-450
Deploy Imported Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-451
Functions that Export Networks and Layer Graphs . . . . . . . . . . . . . . . 18-452

Tips on Importing Models from TensorFlow, PyTorch, and ONNX . . . 18-454


Import Functions of Deep Learning Toolbox . . . . . . . . . . . . . . . . . . . . 18-454
Recommended Functions to Import TensorFlow Models . . . . . . . . . . . 18-454
Autogenerated Custom Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-455
Placeholder Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-456
Input Dimension Ordering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-457
Data Formats for Prediction with dlnetwork . . . . . . . . . . . . . . . . . . . . 18-457
Input Data Preprocessing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-458

Deploy Imported TensorFlow Model with MATLAB Compiler . . . . . . . 18-460

Select Function to Import ONNX Pretrained Network . . . . . . . . . . . . . 18-465


Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-465
Actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-467

Classify Images in Simulink with Imported TensorFlow Network . . . . 18-469

Inference Comparison Between TensorFlow and Imported Networks for Image Classification . . . . . . . . 18-476

Inference Comparison Between ONNX and Imported Networks for Image Classification . . . . . . . . 18-480

List of Functions with dlarray Support . . . . . . . . . . . . . . . . . . . . . . . . . 18-484


Deep Learning Toolbox Functions with dlarray Support . . . . . . . . . . . 18-484
Domain-Specific Functions with dlarray Support . . . . . . . . . . . . . . . . 18-487
MATLAB Functions with dlarray Support . . . . . . . . . . . . . . . . . . . . . . 18-488
Notable dlarray Behaviors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-498

Monitor Custom Training Loop Progress . . . . . . . . . . . . . . . . . . . . . . . 18-501


Create Training Progress Monitor . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-501
Training Progress Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-501
Monitor Custom Training Loop Progress During Training . . . . . . . . . . 18-503

Train Bayesian Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-511

Deep Learning Data Preprocessing
19
Datastores for Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-2
Select Datastore . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-2
Input Datastore for Training, Validation, and Inference . . . . . . . . . . . . . . 19-3
Specify Read Size and Mini-Batch Size . . . . . . . . . . . . . . . . . . . . . . . . . . 19-5
Transform and Combine Datastores . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-6
Use Datastore for Parallel Training and Background Dispatching . . . . . . 19-8

Create and Explore Datastore for Image Classification . . . . . . . . . . . . 19-10

Preprocess Images for Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-16


Resize Images Using Rescaling and Cropping . . . . . . . . . . . . . . . . . . . . 19-16
Augment Images for Training with Random Geometric Transformations . . . . . . . . 19-17
Perform Additional Image Processing Operations Using Built-In Datastores . . . . . . . . 19-18
Apply Custom Image Processing Pipelines Using Combine and Transform . . . . . . . . 19-18

Preprocess Volumes for Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . 19-20


Read Volumetric Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-20
Pair Image and Label Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-21
Preprocess Volumetric Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-21
Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-22

Preprocess Data for Domain-Specific Deep Learning Applications . . . . 19-27


Image Processing Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-27
Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-29
Semantic Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-30
Lidar Processing Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-31
Signal Processing Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-33
Audio Processing Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-35
Text Analytics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-37

Develop Custom Mini-Batch Datastore . . . . . . . . . . . . . . . . . . . . . . . . . . 19-38


Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-38
Implement MiniBatchable Datastore . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-38
Add Support for Shuffling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-43
Validate Custom Mini-Batch Datastore . . . . . . . . . . . . . . . . . . . . . . . . . 19-43

Augment Images for Deep Learning Workflows . . . . . . . . . . . . . . . . . . . 19-45

Augment Pixel Labels for Semantic Segmentation . . . . . . . . . . . . . . . . 19-67

Augment Bounding Boxes for Object Detection . . . . . . . . . . . . . . . . . . . 19-77

Prepare Datastore for Image-to-Image Regression . . . . . . . . . . . . . . . . 19-90

Train Network Using Out-of-Memory Sequence Data . . . . . . . . . . . . . . . 19-97

Train Network Using Custom Mini-Batch Datastore for Sequence Data . . . . . . . . 19-102

Classify Out-of-Memory Text Data Using Deep Learning . . . . . . . . . . . 19-106

Classify Out-of-Memory Text Data Using Custom Mini-Batch Datastore . . . . . . . . 19-112

Data Sets for Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-116


Image Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-116
Time Series and Signal Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-137
Video Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-146
Text Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-147
Audio Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-153
Point Cloud Data Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-158

Choose an App to Label Ground Truth Data . . . . . . . . . . . . . . . . . . . . . 19-163

Deep Learning Code Generation


20
Code Generation for Deep Learning Networks . . . . . . . . . . . . . . . . . . . . . 20-3

Code Generation for Semantic Segmentation Network . . . . . . . . . . . . . 20-10

Lane Detection Optimized with GPU Coder . . . . . . . . . . . . . . . . . . . . . . . 20-14

Code Generation for a Sequence-to-Sequence LSTM Network . . . . . . . 20-21

Deep Learning Prediction on ARM Mali GPU . . . . . . . . . . . . . . . . . . . . . 20-27

Code Generation for Object Detection by Using YOLO v2 . . . . . . . . . . . 20-30

Code Generation For Object Detection Using YOLO v3 Deep Learning . . . . . . . . 20-34

Code Generation for Object Detection Using YOLO v4 Deep Learning . . . . . . . . 20-38

Deep Learning Prediction with NVIDIA TensorRT Library . . . . . . . . . . . 20-43

Traffic Sign Detection and Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . 20-49

Logo Recognition Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-57

Code Generation for Denoising Deep Neural Network . . . . . . . . . . . . . . 20-62

Train and Deploy Fully Convolutional Networks for Semantic Segmentation . . . . . . . . 20-66

Code Generation for Semantic Segmentation Network That Uses U-net . . . . . . . . 20-78

Code Generation for Deep Learning on ARM Targets . . . . . . . . . . . . . . 20-85

Deep Learning Prediction with ARM Compute Using codegen . . . . . . . 20-90

Deep Learning Code Generation on Intel Targets for Different Batch Sizes . . . . . . . . 20-95

Generate C++ Code for Object Detection Using YOLO v2 and Intel MKL-DNN . . . . . . . . 20-104

Code Generation and Deployment of MobileNet-v2 Network to Raspberry Pi . . . . . . . . 20-107

Code Generation for Semantic Segmentation Application on Intel CPUs That Uses U-Net . . . . . . . . 20-111

Code Generation for Semantic Segmentation Application on ARM Neon Targets That Uses U-Net . . . . . . . . 20-120

Code Generation for LSTM Network on Raspberry Pi . . . . . . . . . . . . . 20-129

Code Generation for LSTM Network That Uses Intel MKL-DNN . . . . . 20-136

Cross Compile Deep Learning Code for ARM Neon Targets . . . . . . . . 20-140

Generate Generic C/C++ Code for Sequence-to-Sequence Regression That Uses Deep Learning . . . . . . . . 20-146

Quantize Residual Network Trained for Image Classification and Generate CUDA Code . . . . . . . . 20-155

Quantize Layers in Object Detectors and Generate CUDA Code . . . . . 20-163

Parameter Pruning and Quantization of Image Classification Network . . . . . . . . 20-174

Prune Image Classification Network Using Taylor Scores . . . . . . . . . . 20-191

Quantization Workflow Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . 20-205


Prerequisites for All Quantization Workflows . . . . . . . . . . . . . . . . . . . 20-205
Supported Networks and Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-205
Prerequisites for Calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-205
Prerequisites for Quantization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-206
Prerequisites for Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-206

Quantization of Deep Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . 20-208


Precision and Range . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-208
Histograms of Dynamic Ranges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-208

Prune Filters in a Detection Network Using Taylor Scores . . . . . . . . . 20-216

Prerequisites for Deep Learning with TensorFlow Lite Models . . . . . . 20-244


MathWorks Products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-244
Third-Party Hardware and Software . . . . . . . . . . . . . . . . . . . . . . . . . . 20-244
Environment Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-245

Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi . . . . . . . . 20-247

Deploy Super Resolution Application That Uses TensorFlow Lite (TFLite) Model on Host and Raspberry Pi . . . . . . . . 20-251

Neural Network Objects, Data, and Training Styles


21
Workflow for Neural Network Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-2

Four Levels of Neural Network Design . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-3

Neuron Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-4


Simple Neuron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-4
Transfer Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-5
Neuron with Vector Input . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-5

Neural Network Architectures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-8


One Layer of Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-8
Multiple Layers of Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-10
Input and Output Processing Functions . . . . . . . . . . . . . . . . . . . . . . . . 21-11

Create Neural Network Object . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-13

Configure Shallow Neural Network Inputs and Outputs . . . . . . . . . . . . 21-16

Understanding Shallow Network Data Structures . . . . . . . . . . . . . . . . . 21-18


Simulation with Concurrent Inputs in a Static Network . . . . . . . . . . . . 21-18
Simulation with Sequential Inputs in a Dynamic Network . . . . . . . . . . . 21-19
Simulation with Concurrent Inputs in a Dynamic Network . . . . . . . . . . 21-20

Neural Network Training Concepts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-22


Incremental Training with adapt . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-22
Batch Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-24
Training Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21-26

Multilayer Shallow Neural Networks and Backpropagation Training
22
Multilayer Shallow Neural Networks and Backpropagation Training . . . 22-2

Multilayer Shallow Neural Network Architecture . . . . . . . . . . . . . . . . . . 22-3


Neuron Model (logsig, tansig, purelin) . . . . . . . . . . . . . . . . . . . . . . . . . . 22-3
Feedforward Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-4

Prepare Data for Multilayer Shallow Neural Networks . . . . . . . . . . . . . . 22-6

Choose Neural Network Input-Output Processing Functions . . . . . . . . . 22-7
Representing Unknown or Don't-Care Targets . . . . . . . . . . . . . . . . . . . . 22-8

Divide Data for Optimal Neural Network Training . . . . . . . . . . . . . . . . . . 22-9

Create, Configure, and Initialize Multilayer Shallow Neural Networks . . . . . . . . 22-11
Other Related Architectures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-11
Initializing Weights (init) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-12

Train and Apply Multilayer Shallow Neural Networks . . . . . . . . . . . . . . 22-13


Training Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-13
Training Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-15
Use the Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-17

Analyze Shallow Neural Network Performance After Training . . . . . . . 22-19


Improving Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-22

Limitations and Cautions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-23

Dynamic Neural Networks


23
Introduction to Dynamic Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . 23-2

How Dynamic Neural Networks Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-3


Feedforward and Recurrent Neural Networks . . . . . . . . . . . . . . . . . . . . . 23-3
Applications of Dynamic Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-9
Dynamic Network Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-10
Dynamic Network Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-11

Design Time Series Time-Delay Neural Networks . . . . . . . . . . . . . . . . . 23-12


Prepare Input and Layer Delay States . . . . . . . . . . . . . . . . . . . . . . . . . . 23-15

Design Time Series Distributed Delay Neural Networks . . . . . . . . . . . . 23-16

Design Time Series NARX Feedback Neural Networks . . . . . . . . . . . . . 23-18


Multiple External Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-24

Design Layer-Recurrent Neural Networks . . . . . . . . . . . . . . . . . . . . . . . 23-26

Create Reference Model Controller with MATLAB Script . . . . . . . . . . . 23-29

Multiple Sequences with Dynamic Neural Networks . . . . . . . . . . . . . . . 23-38

Neural Network Time-Series Utilities . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-39

Train Neural Networks with Error Weights . . . . . . . . . . . . . . . . . . . . . . . 23-41

Normalize Errors of Multiple Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-45

Multistep Neural Network Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-52
Set Up in Open-Loop Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23-52
Multistep Closed-Loop Prediction From Initial Conditions . . . . . . . . . . . 23-53
Multistep Closed-Loop Prediction Following Known Sequence . . . . . . . 23-54
Following Closed-Loop Simulation with Open-Loop Simulation . . . . . . . 23-55

Control Systems
24
Introduction to Neural Network Control Systems . . . . . . . . . . . . . . . . . . 24-2

Design Neural Network Predictive Controller in Simulink . . . . . . . . . . . 24-4


System Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-4
Predictive Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-5
Use the Neural Network Predictive Controller Block . . . . . . . . . . . . . . . . 24-6

Design NARMA-L2 Neural Controller in Simulink . . . . . . . . . . . . . . . . . 24-13


Identification of the NARMA-L2 Model . . . . . . . . . . . . . . . . . . . . . . . . . 24-13
NARMA-L2 Controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-14
Use the NARMA-L2 Controller Block . . . . . . . . . . . . . . . . . . . . . . . . . . 24-15

Design Model-Reference Neural Controller in Simulink . . . . . . . . . . . . 24-19


Use the Model Reference Controller Block . . . . . . . . . . . . . . . . . . . . . . 24-20

Import-Export Neural Network Simulink Control Systems . . . . . . . . . . 24-26


Import and Export Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-26
Import and Export Training Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-28

Radial Basis Neural Networks


25
Introduction to Radial Basis Neural Networks . . . . . . . . . . . . . . . . . . . . . 25-2
Important Radial Basis Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-2

Radial Basis Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-3


Neuron Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-3
Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-4
Exact Design (newrbe) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-5
More Efficient Design (newrb) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-6
Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-6

Probabilistic Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-8


Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-8
Design (newpnn) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-9

Generalized Regression Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . 25-11


Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-11
Design (newgrnn) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-12

Self-Organizing and Learning Vector Quantization Networks
26
Introduction to Self-Organizing and LVQ . . . . . . . . . . . . . . . . . . . . . . . . . 26-2
Important Self-Organizing and LVQ Functions . . . . . . . . . . . . . . . . . . . . . 26-2

Cluster with a Competitive Neural Network . . . . . . . . . . . . . . . . . . . . . . . 26-3


Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-3
Create a Competitive Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . 26-3
Kohonen Learning Rule (learnk) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-4
Bias Learning Rule (learncon) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-5
Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-5
Graphical Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-6

Cluster with Self-Organizing Map Neural Network . . . . . . . . . . . . . . . . . 26-8


Topologies (gridtop, hextop, randtop) . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-9
Distance Functions (dist, linkdist, mandist, boxdist) . . . . . . . . . . . . . . . 26-12
Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-14
Create a Self-Organizing Map Neural Network (selforgmap) . . . . . . . . . 26-14
Training (learnsomb) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-16
Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-19

Learning Vector Quantization (LVQ) Neural Networks . . . . . . . . . . . . . 26-27


Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-27
Creating an LVQ Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-28
LVQ1 Learning Rule (learnlv1) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-30
Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-31
Supplemental LVQ2.1 Learning Rule (learnlv2) . . . . . . . . . . . . . . . . . . . 26-32

Adaptive Filters and Adaptive Training


27
Adaptive Neural Network Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-2
Adaptive Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-2
Linear Neuron Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-2
Adaptive Linear Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . 27-3
Least Mean Square Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-5
LMS Algorithm (learnwh) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-6
Adaptive Filtering (adapt) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-6

Advanced Topics
28
Shallow Neural Networks with Parallel and GPU Computing . . . . . . . . . 28-2
Modes of Parallelism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-2
Distributed Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-2
Single GPU Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-4
Distributed GPU Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-6

Parallel Time Series . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-8
Parallel Availability, Fallbacks, and Feedback . . . . . . . . . . . . . . . . . . . . . 28-8

Optimize Neural Network Training Speed and Memory . . . . . . . . . . . . . 28-10


Memory Reduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-10
Fast Elliot Sigmoid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-10

Choose a Multilayer Neural Network Training Function . . . . . . . . . . . . 28-14


SIN Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-15
PARITY Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-16
ENGINE Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-18
CANCER Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-19
CHOLESTEROL Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-21
DIABETES Data Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-22
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-24

Improve Shallow Neural Network Generalization and Avoid Overfitting . . . . . . . . 28-25
Retraining Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-26
Multiple Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-27
Early Stopping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-28
Index Data Division (divideind) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-28
Random Data Division (dividerand) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-29
Block Data Division (divideblock) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-29
Interleaved Data Division (divideint) . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-29
Regularization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-29
Summary and Discussion of Early Stopping and Regularization . . . . . . 28-31
Posttraining Analysis (regression) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-33

Edit Shallow Neural Network Properties . . . . . . . . . . . . . . . . . . . . . . . . . 28-35


Custom Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-35
Network Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-36
Network Behavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-43

Custom Neural Network Helper Functions . . . . . . . . . . . . . . . . . . . . . . . 28-45

Automatically Save Checkpoints During Neural Network Training . . . 28-46

Deploy Shallow Neural Network Functions . . . . . . . . . . . . . . . . . . . . . . . 28-48


Deployment Functions and Tools for Trained Networks . . . . . . . . . . . . . 28-48
Generate Neural Network Functions for Application Deployment . . . . . 28-48
Generate Simulink Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28-50

Deploy Training of Shallow Neural Networks . . . . . . . . . . . . . . . . . . . . . 28-51

Historical Neural Networks


29
Historical Neural Networks Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-2

Perceptron Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-3


Neuron Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-3

Perceptron Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-4
Create a Perceptron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-5
Perceptron Learning Rule (learnp) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-6
Training (train) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-8
Limitations and Cautions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-12

Linear Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-14


Neuron Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-14
Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-15
Least Mean Square Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-17
Linear System Design (newlind) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-18
Linear Networks with Delays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-18
LMS Algorithm (learnwh) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-20
Linear Classification (train) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-21
Limitations and Cautions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29-23

Neural Network Object Reference


30
Neural Network Object Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-2
General . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-2
Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-2
Subobject Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-5
Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-6
Weight and Bias Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-9

Neural Network Subobject Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-11


Inputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-11
Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-12
Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-16
Biases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-18
Input Weights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-19
Layer Weights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30-20

Function Approximation, Clustering, and Control Examples


31
Fit Data Using the Neural Net Fitting App . . . . . . . . . . . . . . . . . . . . . . . . 31-2

Classify Patterns Using the Neural Net Pattern Recognition App . . . . 31-11

Cluster Data Using the Neural Net Clustering App . . . . . . . . . . . . . . . . 31-19

Fit Time Series Data Using the Neural Net Time Series App . . . . . . . . 31-26

Body Fat Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-36

Crab Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-43

Wine Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-51

Cancer Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-59

Character Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-68

Train Stacked Autoencoders for Image Classification . . . . . . . . . . . . . . 31-74

Iris Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-89

Gene Expression Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-97

Maglev Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-105

Competitive Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-116

One-Dimensional Self-Organizing Map . . . . . . . . . . . . . . . . . . . . . . . . . 31-120

Two-Dimensional Self-Organizing Map . . . . . . . . . . . . . . . . . . . . . . . . . 31-123

Radial Basis Approximation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-127

Radial Basis Underlapping Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-131

Radial Basis Overlapping Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-133

GRNN Function Approximation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-135

PNN Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-139

Learning Vector Quantization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-143

Linear Prediction Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-147

Adaptive Linear Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-151

Classification with a Two-Input Perceptron . . . . . . . . . . . . . . . . . . . . . 31-156

Outlier Input Vectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-161

Normalized Perceptron Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-167

Linearly Non-separable Vectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-173

Pattern Association Showing Error Surface . . . . . . . . . . . . . . . . . . . . . 31-176

Training a Linear Neuron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-179

Linear Fit of Nonlinear Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-184

Underdetermined Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-189

Linearly Dependent Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-195

Too Large a Learning Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-198

Adaptive Noise Cancellation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31-203

Shallow Neural Networks Bibliography


32
Shallow Neural Networks Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . 32-2

Mathematical Notation
A
Mathematics and Code Equivalents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A-2
Mathematics Notation to MATLAB Notation . . . . . . . . . . . . . . . . . . . . . . . A-2
Figure Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A-2

Neural Network Blocks for the Simulink Environment


B
Neural Network Simulink Block Library . . . . . . . . . . . . . . . . . . . . . . . . . . . B-2
Transfer Function Blocks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-2
Net Input Blocks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-3
Weight Blocks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-3
Processing Blocks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-3

Deploy Shallow Neural Network Simulink Diagrams . . . . . . . . . . . . . . . . . B-5


Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-5
Suggested Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-7
Generate Functions and Objects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . B-7

Code Notes
C
Deep Learning Toolbox Data Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . C-2
Dimensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . C-2
Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . C-2

1

Deep Networks

• “Deep Learning in MATLAB” on page 1-2


• “Pretrained Deep Neural Networks” on page 1-11
• “Learn About Convolutional Neural Networks” on page 1-21
• “Example Deep Learning Networks Architectures” on page 1-23
• “Multiple-Input and Multiple-Output Networks” on page 1-41
• “List of Deep Learning Layers” on page 1-43
• “Specify Layers of Convolutional Neural Network” on page 1-53
• “Set Up Parameters and Train Convolutional Neural Network” on page 1-64
• “Train Network with Numeric Features” on page 1-68
• “Train Network on Image and Feature Data” on page 1-74
• “Compare Activation Layers” on page 1-81
• “Deep Learning Tips and Tricks” on page 1-87
• “Long Short-Term Memory Networks” on page 1-97

Deep Learning in MATLAB


In this section...
“What Is Deep Learning?” on page 1-2
“Start Deep Learning Faster Using Transfer Learning” on page 1-2
“Deep Learning Workflows” on page 1-3
“Deep Learning Apps” on page 1-5
“Train Classifiers Using Features Extracted from Pretrained Networks” on page 1-7
“Deep Learning with Big Data on CPUs, GPUs, in Parallel, and on the Cloud” on page 1-7
“Deep Learning Using Simulink” on page 1-7
“Deep Learning Interpretability” on page 1-8
“Deep Learning Customization” on page 1-8
“Deep Learning Import and Export” on page 1-9

What Is Deep Learning?


Deep learning is a branch of machine learning that teaches computers to do what comes naturally to
humans: learn from experience. Deep learning uses neural networks to learn useful representations
of features directly from data. Neural networks combine multiple nonlinear processing layers, using
simple elements operating in parallel and inspired by biological nervous systems. Deep learning
models can achieve state-of-the-art accuracy in object classification, sometimes exceeding human-
level performance.

Deep Learning Toolbox provides simple MATLAB commands for creating and interconnecting the
layers of a deep neural network. Examples and pretrained networks make it easy to use MATLAB for
deep learning, even without knowledge of advanced computer vision algorithms or neural networks.
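
As a rough illustration of that style, the following sketch builds a small image classification
network as a layer array and trains it with trainNetwork. It is a minimal sketch, not an example
from this guide: the 28-by-28 grayscale input size, the 10 classes, and the imageDatastore named
imds are assumptions you would replace with your own data.

    % Minimal sketch: assumes imds is an imageDatastore of 28-by-28 grayscale
    % images belonging to 10 classes.
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3,16,'Padding','same')
        batchNormalizationLayer
        reluLayer
        maxPooling2dLayer(2,'Stride',2)
        fullyConnectedLayer(10)
        softmaxLayer
        classificationLayer];

    options = trainingOptions('sgdm', ...
        'MaxEpochs',4, ...
        'Plots','training-progress');

    net = trainNetwork(imds,layers,options);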

For a free hands-on introduction to practical deep learning methods, see Deep Learning Onramp. To
get started with deep learning quickly, see "Try Deep Learning in 10 Lines of MATLAB Code".

Start Deep Learning Faster Using Transfer Learning


Transfer learning is commonly used in deep learning applications. You can take a pretrained network
and use it as a starting point to learn a new task. Fine-tuning a network with transfer learning is
much faster and easier than training from scratch. You can quickly make the network learn a new
task using a smaller number of training images. The advantage of transfer learning is that the
pretrained network has already learned a rich set of features that can be applied to a wide range of
other similar tasks. For an interactive example, see “Transfer Learning with Deep Network Designer”
on page 2-2. For a programmatic example, see “Train Deep Learning Network to Classify New
Images” on page 3-6.

To choose whether to use a pretrained network or create a new deep network, consider the scenarios
in this table.


                  Use a Pretrained Network for Transfer Learning   Create a New Deep Network
Training Data     Hundreds to thousands of labeled data (small)    Thousands to millions of labeled data
Computation       Moderate computation (GPU optional)              Compute intensive (requires GPU for speed)
Training Time     Seconds to minutes                               Days to weeks for real problems
Model Accuracy    Good, depends on the pretrained model            High, but can overfit to small data sets

To explore a selection of pretrained networks, use Deep Network Designer.

Deep Learning Workflows


To learn more about deep learning application areas, see “Deep Learning Applications”.

Domain Example Workflow Learn More


Image Apply deep learning to image data “Get Started with Transfer
classificat tasks. Learning”
ion,
regressio For example, use deep learning for “Pretrained Deep Neural Networks”
n, and image classification and regression. on page 1-11
processin
g “Create Simple Deep Learning
Network for Classification” on page
3-43

“Train Convolutional Neural


Network for Regression” on page 3-
49

“Preprocess Images for Deep


Learning” on page 19-16
Sequence Apply deep learning to sequence “Sequence Classification Using
s and and time series tasks. Deep Learning” on page 4-3
time
series For example, use deep learning for “Time Series Forecasting Using
sequence classification and time Deep Learning” on page 4-16
series forecasting.
Computer Apply deep learning to computer “Getting Started with Semantic
vision vision applications. Segmentation Using Deep Learning”
(Computer Vision Toolbox)
For example, use deep learning for
semantic segmentation and object “Recognition, Object Detection, and
detection. Semantic Segmentation” (Computer
Vision Toolbox)

1-3
1 Deep Networks

Domain Example Workflow Learn More


Audio Apply deep learning to audio and “Audio Processing Using Deep
processin speech processing applications. Learning”
g
For example, use deep learning for “Deep Learning for Audio
speaker identification, speech Applications” (Audio Toolbox)
command recognition, and acoustic
scene recognition.
Automate Apply deep learning to automated “Automated Driving Using Deep
d driving driving applications. Learning”

For example, use deep learning for “Train a Deep Learning Vehicle
vehicle detection and semantic Detector” on page 10-2
segmentation.
Signal Apply deep learning to signal “Signal Processing Using Deep
processin processing applications. Learning”
g
For example, use deep learning for “Classify Time Series Using Wavelet
waveform segmentation, signal Analysis and Deep Learning” on
classification, and denoising speech page 12-77
signals.
Wireless Apply deep learning to wireless “Wireless Communications Using
communi communications systems. Deep Learning”
cations
For example, use deep learning for “Spectrum Sensing with Deep
positioning, spectrum sensing, Learning to Identify 5G and LTE
autoencoder design, and digital Signals” on page 13-85
predistortion (DPD).
“Three-Dimensional Indoor
Positioning with 802.11az
Fingerprinting and Deep Learning”
(WLAN Toolbox)
Reinforce Train deep neural network agents “Reinforcement Learning Using
ment by interacting with an unknown Deep Neural Networks”
learning dynamic environment.

For example, use reinforcement


learning to train policies to
implement controllers and decision-
making algorithms for complex
applications such as resource
allocation, robotics, and autonomous
systems.
Computat Apply deep learning to financial “Computational Finance Using Deep
ional workflows. Learning”
finance
For example, use deep learning for “Compare Deep Learning Networks
applications including instrument for Credit Default Prediction” on
pricing, trading, and risk page 17-2
management.



Lidar processing
  Example workflow: Apply deep learning algorithms to process lidar point cloud data. For example, use deep learning for semantic segmentation and object detection on 3-D organized lidar point cloud data.
  Learn more: "Lidar Processing Using Deep Learning"; "Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning" on page 11-25; "Lidar 3-D Object Detection Using PointPillars Deep Learning" on page 11-68

Text analytics
  Example workflow: Apply deep learning algorithms to text analytics applications. For example, use deep learning for text classification, language translation, and text generation.
  Learn more: "Text Analytics Using Deep Learning"; "Classify Text Data Using Deep Learning" on page 4-193

Predictive maintenance
  Example workflow: Apply deep learning to predictive maintenance applications. For example, use deep learning for fault detection and remaining useful life estimation.
  Learn more: "Predictive Maintenance Using Deep Learning"; "Chemical Process Fault Detection Using Deep Learning" on page 16-2

Deep Learning Apps


Process data, visualize and train networks, track experiments, and quantize networks interactively
using apps.

You can process your data before training using apps to label ground truth data. For more
information on choosing a labeling app, see “Choose an App to Label Ground Truth Data” on page 19-
163.

Deep Network Designer
  Description: Build, visualize, edit, and train deep learning networks.
  Learn more: "Transfer Learning with Deep Network Designer" on page 2-2; "Train Network for Time Series Forecasting Using Deep Network Designer" on page 2-58

Experiment Manager
  Description: Create deep learning experiments to train networks under multiple initial conditions and compare the results.
  Learn more: "Create a Deep Learning Experiment for Classification" on page 6-2; "Create a Deep Learning Experiment for Regression" on page 6-10



Deep Network Quantizer
  Description: Reduce the memory requirement of a deep neural network by quantizing weights, biases, and activations of convolution layers to 8-bit scaled integer data types.
  Learn more: "Quantization of Deep Neural Networks" on page 20-208

Reinforcement Learning Designer
  Description: Design, train, and simulate reinforcement learning agents.
  Learn more: "Design and Train Agent Using Reinforcement Learning Designer" (Reinforcement Learning Toolbox)

Image Labeler
  Description: Label ground truth data in a collection of images.
  Learn more: "Get Started with the Image Labeler" (Computer Vision Toolbox)

Video Labeler
  Description: Label ground truth data in a video, in an image sequence, or from a custom data source reader.
  Learn more: "Get Started with the Video Labeler" (Computer Vision Toolbox)

Ground Truth Labeler
  Description: Label ground truth data in multiple videos, image sequences, or lidar point clouds.
  Learn more: "Get Started with Ground Truth Labelling" (Automated Driving Toolbox)

Lidar Labeler
  Description: Label objects in a point cloud or a point cloud sequence. The app reads point cloud data from PLY, PCAP, LAS, LAZ, ROS and PCD files.
  Learn more: "Get Started with the Lidar Labeler" (Lidar Toolbox)

Signal Labeler
  Description: Label signals for analysis or for use in machine learning and deep learning applications.
  Learn more: "Using Signal Labeler App" (Signal Processing Toolbox)


Train Classifiers Using Features Extracted from Pretrained Networks


Feature extraction allows you to use the power of pretrained networks without investing time and
effort into training. Feature extraction can be the fastest way to use deep learning. You extract
learned features from a pretrained network, and use those features to train a classifier, for example, a
support vector machine (SVM — requires Statistics and Machine Learning Toolbox™). For example, if
an SVM trained using alexnet can achieve >90% accuracy on your training and validation set, then
fine-tuning with transfer learning might not be worth the effort to gain some extra accuracy. If you
perform fine-tuning on a small dataset, then you also risk overfitting. If the SVM cannot achieve good
enough accuracy for your application, then fine-tuning is worth the effort to seek higher accuracy.

For an example, see “Extract Image Features Using Pretrained Network” on page 3-24.
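
The following code outlines this workflow as a minimal sketch. The choice of alexnet, the 'fc7' feature layer, and the imdsTrain and imdsTest image datastores are placeholders for illustration, not requirements.

% Minimal sketch of feature extraction plus an SVM classifier.
% Assumes imdsTrain and imdsTest are labeled imageDatastore objects, and
% that the AlexNet support package and Statistics and Machine Learning
% Toolbox are installed.
net = alexnet;
inputSize = net.Layers(1).InputSize;

% Resize images on the fly to match the network input size.
augTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain);
augTest  = augmentedImageDatastore(inputSize(1:2),imdsTest);

% Extract deep features from a late fully connected layer ('fc7' here).
featuresTrain = activations(net,augTrain,'fc7','OutputAs','rows');
featuresTest  = activations(net,augTest,'fc7','OutputAs','rows');

% Train a multiclass SVM on the extracted features and measure accuracy.
classifier = fitcecoc(featuresTrain,imdsTrain.Labels);
YPred = predict(classifier,featuresTest);
accuracy = mean(YPred == imdsTest.Labels)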

Deep Learning with Big Data on CPUs, GPUs, in Parallel, and on the Cloud

Training deep networks is computationally intensive and can take many hours of computing time;
however, neural networks are inherently parallel algorithms. You can use Parallel Computing
Toolbox™ to take advantage of this parallelism by running in parallel using high-performance GPUs
and computer clusters. To learn more about deep learning in parallel, in the cloud, or using a GPU,
see “Scale Up Deep Learning in Parallel, on GPUs, and in the Cloud” on page 7-2.

Datastores in MATLAB® are a convenient way of working with and representing collections of data
that are too large to fit in memory at one time. To learn more about deep learning with large data
sets, see “Deep Learning with Big Data” on page 7-18.
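
As a rough sketch of how these pieces fit together, you can point a datastore at an on-disk image collection and request GPU or multi-GPU execution through the training options. The folder path and solver settings below are placeholders, not recommendations.

% Stream a large labeled image collection from disk; images are read in
% chunks during training rather than loaded into memory all at once.
imds = imageDatastore('path/to/imageFolder', ...
    'IncludeSubfolders',true,'LabelSource','foldernames');

% Request parallel hardware through the training options
% ('multi-gpu' and 'parallel' require Parallel Computing Toolbox).
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','multi-gpu', ...
    'MiniBatchSize',64);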

Deep Learning Using Simulink


Implement deep learning functionality in Simulink® models by using blocks from the Deep Neural
Networks block library, included in the Deep Learning Toolbox™, or by using the Deep Learning
Object Detector block from the Analysis & Enhancement block library included in the Computer
Vision Toolbox™.

For more information, see “Deep Learning with Simulink”.

Block                             Description
Image Classifier                  Classify data using a trained deep learning neural network
Predict                           Predict responses using a trained deep learning neural network
Stateful Classify                 Classify data using a trained deep learning recurrent neural network
Stateful Predict                  Predict responses using a trained recurrent neural network
Deep Learning Object Detector     Detect objects using trained deep learning object detector

Deep Learning Interpretability


Deep learning networks are often described as "black boxes" because the reason that a network
makes a certain decision is not always obvious. You can use interpretability techniques to translate
network behavior into output that a person can interpret. This interpretable output can then answer
questions about the predictions of a network.

Deep Learning Toolbox provides several visualization methods to help you investigate and
understand network behavior, for example, gradCAM, occlusionSensitivity,
and imageLIME. For more information, see “Deep Learning Visualization Methods” on page 5-233.
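
As a brief illustration, assuming net is a trained image classification network and img is an image already resized to the network input size, Grad-CAM can highlight which regions of the image drove a prediction:

% Classify the image, then compute a Grad-CAM map for the predicted class.
[label,score] = classify(net,img);
scoreMap = gradCAM(net,img,label);

% Overlay the class-activation map on the input image.
figure
imshow(img)
hold on
imagesc(scoreMap,'AlphaData',0.5)
colormap jet
hold off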

Deep Learning Customization


You can train and customize a deep learning model in various ways. For example, you can build a
network using built-in layers or define custom layers. You can then train your network using the built-
in training function trainNetwork or define a deep learning model as a function and use a custom
training loop. For help deciding which method to use, consult the following table.

Built-in training and layers
  Use case: Suitable for most deep learning tasks.
  Learn more: "Create Simple Deep Learning Network for Classification" on page 3-43; "Time Series Forecasting Using Deep Learning" on page 4-16; "List of Deep Learning Layers" on page 1-43

Custom layers
  Use case: If Deep Learning Toolbox does not provide the layer you need for your task, then you can create a custom layer.
  Learn more: "Define Custom Deep Learning Layers" on page 18-9; "Define Custom Deep Learning Intermediate Layers" on page 18-16; "Define Custom Deep Learning Output Layers" on page 18-31

Custom training loop
  Use case: If you need additional customization, you can build and train your network using a custom training loop.
  Learn more: "Define Custom Training Loops, Loss Functions, and Networks" on page 18-221; "Define Deep Learning Network for Custom Training Loops" on page 18-221; "Train Generative Adversarial Network (GAN)" on page 3-72

For more information, see “Train Deep Learning Model in MATLAB” on page 18-3.
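
As a minimal sketch of the built-in route described above, you can stack built-in layers and pass them to trainNetwork. The layer sizes and the XTrain and YTrain training data variables are assumptions for illustration.

% Define a small image classification network from built-in layers.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Train with the built-in trainNetwork function.
options = trainingOptions('adam','MaxEpochs',5,'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);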

Deep Learning Import and Export


You can import networks and layer graphs from TensorFlow™ 2, TensorFlow-Keras, PyTorch®, and
the ONNX™ (Open Neural Network Exchange) model format. You can also export Deep Learning
Toolbox networks and layer graphs to TensorFlow 2 and the ONNX model format.

Import Functions

External Deep Learning Platform and Model Format     Import Model as Network       Import Model as Layer Graph
TensorFlow network in SavedModel format              importTensorFlowNetwork       importTensorFlowLayers
TensorFlow-Keras network in HDF5 or JSON format      importKerasNetwork            importKerasLayers
Traced PyTorch model in a .pt file                   importNetworkFromPyTorch      Not applicable
Network in ONNX model format                         importONNXNetwork             importONNXLayers

The importTensorFlowNetwork and importTensorFlowLayers functions are recommended over
the importKerasNetwork and importKerasLayers functions. For more information, see
“Recommended Functions to Import TensorFlow Models” on page 18-454.
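
For example, the following sketch imports a TensorFlow model saved in SavedModel format. The folder name and class names are placeholders for illustration.

% Import a TensorFlow SavedModel as a MATLAB network.
classNames = ["cat" "dog"];
net = importTensorFlowNetwork("myTFModelFolder", ...
    "OutputLayerType","classification", ...
    "Classes",classNames);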

The importTensorFlowNetwork, importTensorFlowLayers, importNetworkFromPyTorch,
importONNXNetwork, and importONNXLayers functions create automatically generated custom
layers when you import a model with TensorFlow layers, PyTorch layers, or ONNX operators that the
functions cannot convert to built-in MATLAB layers. The functions save the automatically generated
custom layers to a package in the current folder. For more information, see “Autogenerated Custom
Layers” on page 18-455.


Export Functions

Export Network or Layer Graph     External Deep Learning Platform and Model Format
exportNetworkToTensorFlow         TensorFlow 2 model in Python® package
exportONNXNetwork                 ONNX model format

The exportNetworkToTensorFlow function saves a Deep Learning Toolbox network or layer graph
as a TensorFlow model in a Python package. For more information on how to load the exported model
and save it in a standard TensorFlow format, see “Load Exported TensorFlow Model” and “Save
Exported TensorFlow Model in Standard Format”.

By using ONNX as an intermediate format, you can interoperate with other deep learning frameworks
that support ONNX model export or import.
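
A brief sketch of both export paths follows; the package and file names are placeholders, and net is assumed to be a trained Deep Learning Toolbox network.

% Export the network as a TensorFlow model in a Python package named
% "myModel" in the current folder.
exportNetworkToTensorFlow(net,"myModel");

% Alternatively, export the same network to an ONNX model file.
exportONNXNetwork(net,"myModel.onnx");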

See Also

Related Examples
• “Classify Webcam Images Using Deep Learning” on page 3-2
• “Transfer Learning with Deep Network Designer” on page 2-2
• “Pretrained Deep Neural Networks” on page 1-11
• “Create Simple Deep Learning Network for Classification” on page 3-43
• “Example Deep Learning Networks Architectures” on page 1-23
• “Deep Learning Tips and Tricks” on page 1-87


Pretrained Deep Neural Networks

In this section...
“Compare Pretrained Networks” on page 1-12
“Load Pretrained Networks” on page 1-13
“Visualize Pretrained Networks” on page 1-14
“Feature Extraction” on page 1-16
“Transfer Learning” on page 1-17
“Import and Export Networks” on page 1-17
“Pretrained Networks for Audio Applications” on page 1-18
“Pretrained Models on GitHub” on page 1-19

You can take a pretrained image classification network that has already learned to extract powerful
and informative features from natural images and use it as a starting point to learn a new task. The
majority of the pretrained networks are trained on a subset of the ImageNet database [1], which is
used in the ImageNet Large-Scale Visual Recognition Challenge (ILSVRC) [2]. These networks have
been trained on more than a million images and can classify images into 1000 object categories, such
as keyboard, coffee mug, pencil, and many animals. Using a pretrained network with transfer
learning is typically much faster and easier than training a network from scratch.

You can use previously trained networks for the following tasks:

Classification
  Apply pretrained networks directly to classification problems. To classify a new image, use classify. For an example showing how to use a pretrained network for classification, see "Classify Image Using GoogLeNet" on page 3-19.

Feature Extraction
  Use a pretrained network as a feature extractor by using the layer activations as features. You can use these activations as features to train another machine learning model, such as a support vector machine (SVM). For more information, see "Feature Extraction" on page 1-16. For an example, see "Extract Image Features Using Pretrained Network" on page 3-24.

Transfer Learning
  Take layers from a network trained on a large data set and fine-tune on a new data set. For more information, see "Transfer Learning" on page 1-17. For a simple example, see "Get Started with Transfer Learning". To try more pretrained networks, see "Train Deep Learning Network to Classify New Images" on page 3-6.
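
For instance, a minimal classification sketch looks like the following. The choice of googlenet and the built-in peppers.png image are only for illustration, and imresize requires Image Processing Toolbox.

% Load a pretrained network and classify a single image.
net = googlenet;
inputSize = net.Layers(1).InputSize;

img = imread('peppers.png');              % any RGB image
img = imresize(img,inputSize(1:2));       % match the network input size

[label,scores] = classify(net,img);
label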


Compare Pretrained Networks


Pretrained networks have different characteristics that matter when choosing a network to apply to
your problem. The most important characteristics are network accuracy, speed, and size. Choosing a
network is generally a tradeoff between these characteristics. Use the plot below to compare the
ImageNet validation accuracy with the time required to make a prediction using the network.

Tip To get started with transfer learning, try choosing one of the faster networks, such as
SqueezeNet or GoogLeNet. You can then iterate quickly and try out different settings such as data
preprocessing steps and training options. Once you have a feeling of which settings work well, try a
more accurate network such as Inception-v3 or a ResNet and see if that improves your results.

Note The plot above only shows an indication of the relative speeds of the different networks. The
exact prediction and training iteration times depend on the hardware and mini-batch size that you
use.

A good network has a high accuracy and is fast. The plot displays the classification accuracy versus
the prediction time when using a modern GPU (an NVIDIA® Tesla® P100) and a mini-batch size of
128. The prediction time is measured relative to the fastest network. The area of each marker is
proportional to the size of the network on disk.

The classification accuracy on the ImageNet validation set is the most common way to measure the
accuracy of networks trained on ImageNet. Networks that are accurate on ImageNet are also often
accurate when you apply them to other natural image data sets using transfer learning or feature
extraction. This generalization is possible because the networks have learned to extract powerful and
informative features from natural images that generalize to other similar data sets. However, high
accuracy on ImageNet does not always transfer directly to other tasks, so it is a good idea to try
multiple networks.

If you want to perform prediction using constrained hardware or distribute networks over the
Internet, then also consider the size of the network on disk and in memory.

Network Accuracy

There are multiple ways to calculate the classification accuracy on the ImageNet validation set and
different sources use different methods. Sometimes an ensemble of multiple models is used and
sometimes each image is evaluated multiple times using multiple crops. Sometimes the top-5
accuracy instead of the standard (top-1) accuracy is quoted. Because of these differences, it is often
not possible to directly compare the accuracies from different sources. The accuracies of pretrained
networks in Deep Learning Toolbox are standard (top-1) accuracies using a single model and single
central image crop.

Load Pretrained Networks


To load the SqueezeNet network, type squeezenet at the command line.

net = squeezenet;

For other networks, use functions such as googlenet to get links to download pretrained networks
from the Add-On Explorer.

The following table lists the available pretrained networks trained on ImageNet and some of their
properties. The network depth is defined as the largest number of sequential convolutional or fully
connected layers on a path from the input layer to the output layer. The inputs to all networks are
RGB images.

Network             Depth   Size      Parameters (Millions)   Image Input Size
squeezenet          18      5.2 MB    1.24                    227-by-227
googlenet           22      27 MB     7.0                     224-by-224
inceptionv3         48      89 MB     23.9                    299-by-299
densenet201         201     77 MB     20.0                    224-by-224
mobilenetv2         53      13 MB     3.5                     224-by-224
resnet18            18      44 MB     11.7                    224-by-224
resnet50            50      96 MB     25.6                    224-by-224
resnet101           101     167 MB    44.6                    224-by-224
xception            71      85 MB     22.9                    299-by-299
inceptionresnetv2   164     209 MB    55.9                    299-by-299
shufflenet          50      5.4 MB    1.4                     224-by-224
nasnetmobile        *       20 MB     5.3                     224-by-224
nasnetlarge         *       332 MB    88.9                    331-by-331
darknet19           19      78 MB     20.8                    256-by-256
darknet53           53      155 MB    41.6                    256-by-256
efficientnetb0      82      20 MB     5.3                     224-by-224
alexnet             8       227 MB    61.0                    227-by-227
vgg16               16      515 MB    138                     224-by-224
vgg19               19      535 MB    144                     224-by-224

*The NASNet-Mobile and NASNet-Large networks do not consist of a linear sequence of modules.

GoogLeNet Trained on Places365

The standard GoogLeNet network is trained on the ImageNet data set but you can also load a
network trained on the Places365 data set [3] [4]. The network trained on Places365 classifies images
into 365 different place categories, such as field, park, runway, and lobby. To load a pretrained
GoogLeNet network trained on the Places365 data set, use
googlenet('Weights','places365'). When performing transfer learning to perform a new task,
the most common approach is to use networks pretrained on ImageNet. If the new task is similar to
classifying scenes, then using the network trained on Places365 could give higher accuracies.
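
For example, assuming the GoogLeNet support package is installed:

% Load GoogLeNet weights trained on the Places365 data set.
net = googlenet('Weights','places365');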

For information about pretrained networks suitable for audio tasks, see “Pretrained Networks for
Audio Applications” on page 1-18.

Visualize Pretrained Networks


You can load and visualize pretrained networks using Deep Network Designer.

deepNetworkDesigner(squeezenet)
