Chapter 06 - in class
Chapter 6
Deep Learning and Cognitive
Computing
Copyright © 2020, 2015, 2011 Pearson Education, Inc. All Rights Reserved
Learning Objectives (1 of 2)
6.1 Learn what deep learning is and how it is changing the
world of computing
6.2 Know the placement of deep learning within the broad
family of AI learning methods
6.3 Understand how traditional “shallow” artificial neural
networks (ANN) work
6.4 Become familiar with the development and learning
processes of ANN
6.5 Develop an understanding of the methods to shed light
into the ANN black box
Learning Objectives (2 of 2)
6.6 Know the underlying concept and methods for deep
neural networks
6.7 Become familiar with different types of deep learning
methods
6.8 Understand how convolutional neural networks (CNN),
recurrent neural networks (RNN), and long short-term
memory (LSTM) networks work
6.9 Become familiar with the computer frameworks for
implementing deep learning
6.10 Know the foundational details about cognitive
computing
Opening Vignette (1 of 4)
Fighting Fraud with Deep Learning and Artificial
Intelligence
• Business problem
– Danske Bank
– Predictive analytics in banking
Fraud detection (lots of false positives)
• The solution
– Deep learning
• The results
– 60% reduction in false positives
– 50% increase in true positives
Opening Vignette (2 of 4)
Fighting Fraud with Deep Learning and Artificial
Intelligence
• Accuracy
– ROC curve
• DL vs traditional ML
techniques
Opening Vignette (4 of 4)
Discussion Questions:
1. What is fraud in banking?
2. What are the types of fraud that banking firms are facing
today?
3. What do you think are the implications of fraud on banks
and on their customers?
4. Compare the old and new methods for identifying and
mitigating fraud.
5. Why do you think deep learning methods provided better
prediction accuracy?
6. Discuss the trade-off between false positive and false
negative (type 1 and type 2 errors) within the context of
predicting fraudulent activities.
Introduction to Deep Learning
• Deep learning is the newest member of the AI/Machine
Learning family
– Image recognition
– Speech recognition…
• Difference from classic methods
– Classic: structured and relevant data provided by
humans
– Deep learning: automatic feature/data representation
and extraction
Introduction to Deep Learning
• Differences between Classic Machine-Learning Methods
and Representation Learning/Deep Learning
Introduction to Deep Learning
• The placement of Deep Learning within the overarching
AI-based learning methods
Basics of “Shallow” Learning (1 of 4)
• Artificial Neural Networks
• Neurons = Processing Elements (PEs)
• Single-input and single-output neuron/PE
Basics of “Shallow” Learning (2 of 4)
• Common transfer (activation) functions
Basics of “Shallow” Learning (3 of 4)
• Typical multiple-input neuron with R individual inputs
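A multiple-input neuron can be sketched in a few lines of numpy: the summation function computes the weighted sum of the R inputs plus a bias, and a transfer (activation) function, here the log-sigmoid, squashes the result. The input values and weights below are hypothetical, chosen only for illustration.

```python
import numpy as np

def neuron(p, w, b):
    """Multiple-input neuron/PE: summation function followed by
    a log-sigmoid transfer (activation) function."""
    n = np.dot(w, p) + b             # summation: weighted sum of R inputs + bias
    return 1.0 / (1.0 + np.exp(-n))  # log-sigmoid activation

p = np.array([1.0, 0.5, -0.3])       # R = 3 inputs (hypothetical values)
w = np.array([0.2, -0.4, 0.1])       # connection weights (hypothetical)
a = neuron(p, w, b=0.05)
print(round(a, 4))
```

Swapping the last line of `neuron` for `np.tanh(n)` or `max(0, n)` gives the hyperbolic-tangent and ReLU transfer functions from the previous slide.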
Technology Insight 6.1
Elements of an Artificial Neural Network
• Processing element (PE)
• Network structure
– Hidden layer(s)
• Input
• Output
• Connection weights
• Summation function
• Transfer/Activation function
Technology Insight 6.1
Elements of an Artificial Neural Network
• Neural Network with One Hidden Layer
Technology Insight 6.1
Elements of an Artificial Neural Network
Summation Functions
Transfer Function
Learning Process in ANN
1. Compute temporary outputs.
2. Compare outputs with
desired targets.
3. Adjust the weights and
repeat the process.
Backpropagation for ANN Training
1. Initialize weights with random values
2. Read in the input vector and the desired output
3. Compute the actual output via the calculations
4. Compute the error
5. Change the weights by working backward
Backpropagation for ANN Training
• Illustration of overfitting in ANN
Process of Developing Neural-Network Based
Systems
• Constant feedback for changes and improvements
Illuminating the Black Box of ANN
• ANNs are typically known as black boxes
• Sensitivity analysis can shed light on the black box
Deep Neural Networks (1 of 3)
• Deep: more hidden layers
• In addition to CPUs, it also uses GPUs
– Computational feasibility
– More advanced processors from NVIDIA
• Needs large datasets
• Deep learning uses tensors as inputs
– Tensor: an N-dimensional array
– Images represented as 3-D tensors
• There are different types and capabilities of Deep Neural
Networks for different tasks/purposes
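The tensor idea is easy to make concrete: a color image is a 3-D tensor (height × width × channels), and a mini-batch of images is a 4-D tensor. The 64×64 size and batch size of 32 below are arbitrary illustrative choices.

```python
import numpy as np

# A 64x64 RGB image as a 3-D tensor: height x width x color channels
image = np.zeros((64, 64, 3), dtype=np.uint8)
image[:, :, 0] = 255               # a pure-red image: max out the R channel

# A mini-batch of 32 such images forms a 4-D tensor
batch = np.stack([image] * 32)
print(image.shape, batch.shape)
```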
Deep Neural Networks (2 of 3)
Feedforward Multilayer Perceptron (MLP)-Type Deep
Networks
• Most common type of deep networks
• Vector Representation of the First Three Layers in a
Typical MLP Network.
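In vector form, each MLP layer is just a matrix-vector product plus a bias, passed through an activation. The sketch below (layer sizes and random weights are hypothetical) shows the input layer feeding two hidden layers.

```python
import numpy as np

rng = np.random.default_rng(1)

def layer(a_prev, W, b):
    """One fully connected MLP layer: weighted sums, then sigmoid activations."""
    return 1 / (1 + np.exp(-(W @ a_prev + b)))

x  = rng.normal(size=4)                                       # input layer: 4 features
a1 = layer(x,  rng.normal(size=(5, 4)), rng.normal(size=5))   # hidden layer 1: 5 neurons
a2 = layer(a1, rng.normal(size=(3, 5)), rng.normal(size=3))   # hidden layer 2: 3 neurons
print(a1.shape, a2.shape)
```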
Deep Neural Networks (3 of 3)
• Impact of Random
Weights in Deep MLP
• The Effect of Pre-
training Network
Parameters on
Improving Results of a
Classification-Type
Deep Neural Network.
• More hidden layers
versus more neurons?
– More layers with
fewer neurons
Convolutional “Deep” Neural
Networks
• CNN: The most popular MLP-based DL method
• Used for image/video processing, facial/text recognition
• Has at least one convolutional weight function
– Convolutional layer applies a filter/kernel
– Pooling (sub-sampling) layer
• Motivation
– Consolidating the large tensors into one with a smaller
size
– Reducing the number of model parameters while
keeping only the important features
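Both operations fit in a few lines of numpy. This is a minimal sketch (the image, the edge filter, and the 2×2 pool size are all hypothetical choices): the convolution slides a small kernel over the input, and max pooling sub-samples the resulting feature map, keeping only the strongest responses.

```python
import numpy as np

def convolve2d(img, kernel):
    """'Valid' 2-D convolution (cross-correlation, as most DL libraries compute it)."""
    kh, kw = kernel.shape
    rows, cols = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            out[i, j] = (img[i:i+kh, j:j+kw] * kernel).sum()
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: sub-sample by keeping the max in each patch."""
    rows, cols = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:rows*size, :cols*size].reshape(rows, size, cols, size).max(axis=(1, 3))

img  = np.arange(36.).reshape(6, 6)     # toy 6x6 "image"
edge = np.array([[1., -1.]])            # simple horizontal-edge filter
fmap = convolve2d(img, edge)            # 6x5 feature map
print(max_pool(fmap).shape)             # pooling halves each dimension
```

Note how the parameter count drops: the filter has 2 weights shared across every position, instead of one weight per input pixel.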
Convolution Function
• Typical Convolutional Network Unit
Image Processing Using CNN
• ImageNet (https://image-net.org/)
– Object Localization
• Architecture of AlexNet, a CNN for Image Classification
Image Processing Using CNN
• Examples of Using the Google Lens
Figure 6.28 Two Examples of Using the Google Lens, a Service Based
on Convolutional Deep Networks for Image Recognition.
Source: ©2018 Google LLC, used with permission. Google and the Google logo are
registered trademarks of Google LLC.
Text Processing Using CNN
• Google word2vec project
– converts a text corpus into word vectors
– word embeddings: each word is a numeric vector
• Word Embeddings in a Two-Dimensional Space
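The key property of word embeddings is that semantically related words end up close together in the vector space. The toy 4-dimensional vectors below are made up for illustration (real word2vec embeddings typically have 100-300 dimensions), but the cosine-similarity comparison works the same way.

```python
import numpy as np

# Hypothetical 4-D word embeddings (real word2vec vectors are much longer)
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.8, 0.9, 0.1, 0.3]),
    "apple": np.array([0.1, 0.1, 0.9, 0.8]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Related words sit closer together than unrelated ones
print(cosine(emb["king"], emb["queen"]) > cosine(emb["king"], emb["apple"]))
```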
Recurrent Neural Networks (RNN)
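The defining feature of an RNN is that the same weights are reused at every time step, with the hidden state fed back as an extra input. A minimal numpy sketch (layer sizes, sequence length, and random weights are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# One recurrent layer: hidden state h is fed back at every time step
Wx = rng.normal(scale=0.5, size=(3, 4))    # input-to-hidden weights
Wh = rng.normal(scale=0.5, size=(3, 3))    # hidden-to-hidden (recurrent) weights
b  = np.zeros(3)

h = np.zeros(3)                            # initial hidden state
for x_t in rng.normal(size=(5, 4)):        # a sequence of 5 input vectors
    h = np.tanh(Wx @ x_t + Wh @ h + b)     # same weights reused at each step

print(h.shape)                             # final state summarizes the sequence
```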
Long Short-Term Memory (LSTM)
Typical Long
Short-Term
Memory (L S T M)
Network
Architecture
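An LSTM cell extends the plain recurrent step with three gates that decide what to erase from, write to, and read from a long-term cell state. The sketch below is a minimal single-cell implementation with hypothetical sizes and random weights, not a production design:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: forget (f), input (i), and output (o) gates control
    the cell state c (long-term memory) and the hidden output h."""
    n = h.size
    z = W @ x + U @ h + b                      # all four gate pre-activations at once
    f = sigmoid(z[0:n])                        # forget gate: what to erase from c
    i = sigmoid(z[n:2*n])                      # input gate: what to write to c
    o = sigmoid(z[2*n:3*n])                    # output gate: what to read from c
    g = np.tanh(z[3*n:4*n])                    # candidate cell contents
    c = f * c + i * g                          # long-term memory update
    h = o * np.tanh(c)                         # short-term output
    return h, c

rng = np.random.default_rng(3)
n, d = 3, 4                                    # hidden size, input size (hypothetical)
W = rng.normal(size=(4*n, d))
U = rng.normal(size=(4*n, n))
b = np.zeros(4*n)

h = c = np.zeros(n)
for x in rng.normal(size=(5, d)):              # run a 5-step input sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

Because `c` is updated additively (`f * c + i * g`), gradients flow through long sequences far better than in a plain RNN, which is what lets LSTMs remember over long spans.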
Long Short-Term Memory (LSTM)
Example Indicating
the Close-to-
Human
Performance of the
Google Neural
Machine Translator
(G N M T)
Computer Frameworks for
Implementation of Deep Learning
• Torch (http://www.torch.ch)
– ML with GPU
• Caffe (caffe.berkeleyvision.org)
– Facebook’s improved version (www.caffe2.ai)
• TensorFlow (www.tensorflow.org)
– Google - Tensor Processing Units (TPUs)
• Theano (deeplearning.net/software/theano)
– Deep Learning Group at the University of Montreal
• Keras (keras.io)
– High-level deep learning API (application
programming interface)
Cognitive Computing
• Use mathematical models to emulate (or partially
simulate) the human cognition process
• IBM Watson on Jeopardy!
• How does cognitive computing work?
– Adaptive
– Interactive
– Iterative
– Contextual
Conceptual Framework for Cognitive
Computing and Its Promises
Cognitive Search
• Can handle a variety of data types
• Can contextualize the search space
• Employs advanced AI technologies
• Enables developers to build enterprise-specific search
applications
Copyright
Copyright © 2020, 2015, 2011 Pearson Education, Inc. All Rights Reserved