
Recap on ANN

Overview

Teaching: 40 min
Exercises: 0 min
Questions
  • What are the basics of an ANN?

Objectives
  • Recap the key concepts of Artificial Neural Networks (ANN)

Recap on Artificial Neural Network (ANN)

The previous lecture on ANN from the Machine Learning course can be found here: https://vuminhtue.github.io/Machine-Learning-Python/10-Neural-Network/index.html


Biological Neural Network


Machine Learning Neural Network (normally consists of one hidden layer: a shallow network)


Deep Neural Network (multiple hidden layers)

Forward Propagation (Feed-forward)

Backpropagation

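The two phases above can be sketched for the simplest possible case: a single sigmoid neuron trained on one example with a squared-error loss. This is a minimal pure-Python illustration (the values of the weight, bias, input, and target are hypothetical, chosen only for demonstration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical starting values for one neuron, one input.
w, b = 0.5, 0.1          # weight and bias
x, y = 1.0, 1.0          # input and target

# Forward propagation: compute the prediction and the loss.
z = w * x + b            # pre-activation
a = sigmoid(z)           # activation (the prediction)
loss = 0.5 * (a - y) ** 2

# Backpropagation: apply the chain rule from the loss back to the parameters.
dloss_da = a - y                  # dL/da
da_dz = a * (1 - a)               # sigmoid'(z)
dz_dw, dz_db = x, 1.0             # dz/dw and dz/db
grad_w = dloss_da * da_dz * dz_dw
grad_b = dloss_da * da_dz * dz_db

# One gradient-descent update with learning rate 0.1.
lr = 0.1
w -= lr * grad_w
b -= lr * grad_b
```

In a real network the same chain-rule step is repeated layer by layer, with matrix multiplications in place of the scalar products here.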

Activation Functions

Typical activation functions in Neural Network:

(* denotes the most popular functions)

(1) Sigmoid Function


(2) Softmax

(3) Hyperbolic Tangent Function


(4) ReLU Function

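The four activation functions listed above can be written directly from their definitions. This is a plain-Python sketch for reference (scalar versions; real frameworks apply them element-wise to tensors):

```python
import math

def sigmoid(z):
    """Squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    """Hyperbolic tangent: squashes into (-1, 1), zero-centred."""
    return math.tanh(z)

def relu(z):
    """Rectified Linear Unit: max(0, z)."""
    return max(0.0, z)

def softmax(zs):
    """Turns a list of scores into probabilities that sum to 1."""
    m = max(zs)                           # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]
```

Note that softmax, unlike the others, operates on a whole vector of scores at once, which is why it is typically used only in the output layer for classification.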

Gradient problem

When training a deep neural network with gradient-based learning and backpropagation, we find the partial derivatives by traversing the network from the final layer back to the initial layer. By the chain rule, the derivatives for layers deeper in the network are computed through a long sequence of matrix multiplications, so small factors shrink the result exponentially and large factors blow it up.

Vanishing gradient: repeated multiplication by small per-layer derivatives shrinks the gradient toward zero, so the earliest layers learn extremely slowly or not at all.

Exploding gradient: conversely, repeated multiplication by large per-layer derivatives makes the gradient grow without bound, producing unstable updates (e.g. NaN losses).
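The vanishing case is easy to demonstrate numerically. The derivative of the sigmoid, sigmoid'(z) = sigmoid(z)·(1 − sigmoid(z)), never exceeds 0.25, so even in the best case a 20-layer chain of sigmoid derivatives multiplies the gradient by at most 0.25 per layer (this sketch assumes weights of magnitude 1, purely for illustration):

```python
import math

def sigmoid_prime(z):
    s = 1.0 / (1.0 + math.exp(-z))
    return s * (1 - s)   # maximum value is 0.25, attained at z = 0

# Best case: every layer sits at z = 0, where sigmoid' is largest.
grad = 1.0
for layer in range(20):          # a 20-layer network
    grad *= sigmoid_prime(0.0)   # chain rule: multiply per-layer derivatives

print(grad)   # 0.25**20, about 9.1e-13 -- the gradient has vanished
```

ReLU avoids this particular shrinkage because its derivative is exactly 1 for positive inputs, which is one reason it is the default choice in deep networks.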

Solutions

Common remedies include a better weight initialization (e.g. He initialization), batch normalization between layers, and gradient clipping:

keras.layers.Dense(25, activation="relu", kernel_initializer="he_normal")  # He initialization
keras.layers.BatchNormalization()                                          # batch normalization
optimizer = keras.optimizers.SGD(clipvalue=1.0)                            # gradient clipping
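The `clipvalue=1.0` argument caps each gradient component element-wise before the update is applied. As a plain-Python sketch of the idea (the function name here is hypothetical, not a Keras API):

```python
def clip_by_value(grads, clip=1.0):
    """Element-wise gradient clipping, the idea behind SGD(clipvalue=1.0):
    force every gradient component into the range [-clip, clip]."""
    return [max(-clip, min(clip, g)) for g in grads]

# A gradient vector with two exploding components.
exploding = [0.3, -5.2, 12.0, -0.8]
clipped = clip_by_value(exploding)
print(clipped)   # [0.3, -1.0, 1.0, -0.8]
```

Clipping does not fix the underlying cause of exploding gradients, but it keeps any single update from destabilizing training.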

Key Points

  • An ANN consists of layers of neurons with nonlinear activation functions; training alternates forward propagation and backpropagation, and deep networks can suffer from vanishing or exploding gradients, mitigated by He initialization, batch normalization, and gradient clipping.